[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
9733 1726773054.06732: starting run
ansible-playbook [core 2.16.11]
config file = None
configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
ansible collection location = /tmp/collections-EI7
executable location = /usr/local/bin/ansible-playbook
python version = 3.12.1 (main, Feb 21 2024, 14:18:26) [GCC 8.5.0 20210514 (Red Hat 8.5.0-21)] (/usr/bin/python3.12)
jinja version = 3.1.4
libyaml = True
No config file found; using defaults
9733 1726773054.07035: Added group all to inventory
9733 1726773054.07037: Added group ungrouped to inventory
9733 1726773054.07040: Group all now contains ungrouped
9733 1726773054.07042: Examining possible inventory source: /tmp/kernel_settings-PVh/inventory.yml
9733 1726773054.16015: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
9733 1726773054.16059: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
9733 1726773054.16077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
9733 1726773054.16120: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
9733 1726773054.16166: Loaded config def from plugin (inventory/script)
9733 1726773054.16170: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
9733 1726773054.16199: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
9733 1726773054.16257: Loaded config def from plugin (inventory/yaml)
9733 1726773054.16259: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
9733 1726773054.16321: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
9733 1726773054.16606: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
9733 1726773054.16609: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
9733 1726773054.16611: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
9733 1726773054.16616: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
9733 1726773054.16619: Loading data from /tmp/kernel_settings-PVh/inventory.yml
9733 1726773054.16662: /tmp/kernel_settings-PVh/inventory.yml was not parsable by auto
9733 1726773054.16708: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
9733 1726773054.16736: Loading data from /tmp/kernel_settings-PVh/inventory.yml
9733 1726773054.16796: group all already in inventory
9733 1726773054.16801: set inventory_file for managed_node1
9733 1726773054.16804: set inventory_dir for managed_node1
9733 1726773054.16804: Added host managed_node1 to inventory
9733 1726773054.16806: Added host managed_node1 to group all
9733 1726773054.16806: set ansible_host for managed_node1
9733
1726773054.16807: set ansible_ssh_extra_args for managed_node1 9733 1726773054.16809: set inventory_file for managed_node2 9733 1726773054.16810: set inventory_dir for managed_node2 9733 1726773054.16811: Added host managed_node2 to inventory 9733 1726773054.16811: Added host managed_node2 to group all 9733 1726773054.16812: set ansible_host for managed_node2 9733 1726773054.16812: set ansible_ssh_extra_args for managed_node2 9733 1726773054.16814: set inventory_file for managed_node3 9733 1726773054.16815: set inventory_dir for managed_node3 9733 1726773054.16815: Added host managed_node3 to inventory 9733 1726773054.16816: Added host managed_node3 to group all 9733 1726773054.16817: set ansible_host for managed_node3 9733 1726773054.16817: set ansible_ssh_extra_args for managed_node3 9733 1726773054.16819: Reconcile groups and hosts in inventory. 9733 1726773054.16821: Group ungrouped now contains managed_node1 9733 1726773054.16822: Group ungrouped now contains managed_node2 9733 1726773054.16823: Group ungrouped now contains managed_node3 9733 1726773054.16881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 9733 1726773054.16962: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 9733 1726773054.17000: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 9733 1726773054.17018: Loaded config def from plugin (vars/host_group_vars) 9733 1726773054.17019: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 9733 1726773054.17024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 9733 1726773054.17029: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 9733 1726773054.17056: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 9733 1726773054.17299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773054.17364: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 9733 1726773054.17392: Loaded config def from plugin (connection/local) 9733 1726773054.17394: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 9733 1726773054.17729: Loaded config def from plugin (connection/paramiko_ssh) 9733 1726773054.17731: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 9733 1726773054.18334: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 9733 1726773054.18357: Loaded config def from plugin (connection/psrp) 9733 1726773054.18359: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 9733 1726773054.18790: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 9733 1726773054.18813: Loaded config def from plugin (connection/ssh) 9733 1726773054.18815: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 9733 1726773054.20049: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 9733 1726773054.20075: Loaded config def from plugin (connection/winrm) 9733 1726773054.20077: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 9733 1726773054.20102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 9733 1726773054.20148: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 9733 1726773054.20191: Loaded config def from plugin (shell/cmd) 9733 1726773054.20192: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 9733 1726773054.20209: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 9733 1726773054.20248: Loaded config def from plugin (shell/powershell) 9733 1726773054.20249: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 9733 1726773054.20291: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 9733 1726773054.20397: Loaded config def from plugin (shell/sh) 9733 1726773054.20399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 9733 1726773054.20423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 9733 1726773054.20503: Loaded config def from plugin (become/runas) 9733 1726773054.20508: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 9733 1726773054.20720: Loaded config def from plugin (become/su) 9733 1726773054.20722: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 9733 1726773054.20883: Loaded config def from plugin (become/sudo) 9733 1726773054.20888: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 9733 1726773054.20925: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_simple_settings.yml 9733 1726773054.21471: trying /usr/local/lib/python3.12/site-packages/ansible/modules 9733 1726773054.23595: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 9733 1726773054.23841: in VariableManager get_vars() 9733 1726773054.23861: done with get_vars() 9733 1726773054.23903: in VariableManager get_vars() 9733 
1726773054.23918: done with get_vars() 9733 1726773054.24120: in VariableManager get_vars() 9733 1726773054.24133: done with get_vars() 9733 1726773054.24276: in VariableManager get_vars() 9733 1726773054.24289: done with get_vars() 9733 1726773054.24334: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 9733 1726773054.24348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 9733 1726773054.24584: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 9733 1726773054.24750: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 9733 1726773054.24753: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-EI7/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 9733 1726773054.24790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 9733 1726773054.24812: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 9733 1726773054.24980: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 9733 1726773054.25045: Loaded config def from plugin (callback/default) 9733 1726773054.25047: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 9733 1726773054.26257: Loaded config def from plugin (callback/junit) 9733 1726773054.26260: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 9733 1726773054.26309: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 9733 1726773054.26378: Loaded config def from plugin (callback/minimal) 9733 1726773054.26381: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 9733 1726773054.26422: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 9733 1726773054.26700: Loaded config def from plugin 
(callback/tree) 9733 1726773054.26703: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 9733 1726773054.26833: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 9733 1726773054.26835: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-EI7/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: tests_simple_settings.yml ******************************************** 1 plays in /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_simple_settings.yml 9733 1726773054.26862: in VariableManager get_vars() 9733 1726773054.26877: done with get_vars() 9733 1726773054.26884: in VariableManager get_vars() 9733 1726773054.26894: done with get_vars() 9733 1726773054.26898: variable 'omit' from source: magic vars 9733 1726773054.26937: in VariableManager get_vars() 9733 1726773054.26951: done with get_vars() 9733 1726773054.26973: variable 'omit' from source: magic vars PLAY [Test simple kernel settings] ********************************************* 9733 1726773054.27574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 9733 1726773054.27650: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 9733 1726773054.27694: getting the remaining hosts for this loop 9733 1726773054.27700: done getting the remaining hosts for this loop 9733 1726773054.27704: getting the next task for host managed_node3 9733 1726773054.27708: done getting next task for host managed_node3 9733 1726773054.27709: ^ task is: TASK: Gathering Facts 9733 1726773054.27711: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773054.27713: getting variables 9733 1726773054.27714: in VariableManager get_vars() 9733 1726773054.27724: Calling all_inventory to load vars for managed_node3 9733 1726773054.27746: Calling groups_inventory to load vars for managed_node3 9733 1726773054.27750: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773054.27763: Calling all_plugins_play to load vars for managed_node3 9733 1726773054.27775: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773054.27779: Calling groups_plugins_play to load vars for managed_node3 9733 1726773054.27815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773054.27865: done with get_vars() 9733 1726773054.27871: done getting variables 9733 1726773054.27944: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_simple_settings.yml:2 Thursday 19 September 2024 15:10:54 -0400 (0:00:00.012) 0:00:00.012 **** 9733 1726773054.27966: entering _queue_task() for managed_node3/gather_facts 9733 1726773054.27968: Creating lock for gather_facts 9733 1726773054.28250: worker is 1 (out of 1 available) 9733 1726773054.28260: exiting _queue_task() for managed_node3/gather_facts 9733 1726773054.28274: done queuing things up, now waiting for results queue to drain 9733 1726773054.28276: waiting for pending results... 9743 1726773054.28499: running TaskExecutor() for managed_node3/TASK: Gathering Facts 9743 1726773054.28618: in run() - task 0affffe7-6841-7dd6-8fa6-000000000014 9743 1726773054.28637: variable 'ansible_search_path' from source: unknown 9743 1726773054.28670: calling self._execute() 9743 1726773054.28729: variable 'ansible_host' from source: host vars for 'managed_node3' 9743 1726773054.28739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9743 1726773054.28749: variable 'omit' from source: magic vars 9743 1726773054.28843: variable 'omit' from source: magic vars 9743 1726773054.28870: variable 'omit' from source: magic vars 9743 1726773054.28905: variable 'omit' from source: magic vars 9743 1726773054.28949: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9743 1726773054.28982: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9743 1726773054.29005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9743 1726773054.29023: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9743 1726773054.29035: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9743 1726773054.29061: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9743 1726773054.29065: variable 'ansible_host' from source: host vars for 'managed_node3' 9743 1726773054.29069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9743 1726773054.29156: Set connection var ansible_timeout to 10 9743 1726773054.29162: Set connection var ansible_shell_type to sh 9743 1726773054.29167: Set connection var 
ansible_module_compression to ZIP_DEFLATED 9743 1726773054.29172: Set connection var ansible_shell_executable to /bin/sh 9743 1726773054.29177: Set connection var ansible_pipelining to False 9743 1726773054.29183: Set connection var ansible_connection to ssh 9743 1726773054.29202: variable 'ansible_shell_executable' from source: unknown 9743 1726773054.29205: variable 'ansible_connection' from source: unknown 9743 1726773054.29208: variable 'ansible_module_compression' from source: unknown 9743 1726773054.29210: variable 'ansible_shell_type' from source: unknown 9743 1726773054.29213: variable 'ansible_shell_executable' from source: unknown 9743 1726773054.29215: variable 'ansible_host' from source: host vars for 'managed_node3' 9743 1726773054.29218: variable 'ansible_pipelining' from source: unknown 9743 1726773054.29220: variable 'ansible_timeout' from source: unknown 9743 1726773054.29223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9743 1726773054.29341: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=True, class_only=False) 9743 1726773054.29351: variable 'omit' from source: magic vars 9743 1726773054.29355: starting attempt loop 9743 1726773054.29357: running the handler 9743 1726773054.29367: variable 'ansible_facts' from source: unknown 9743 1726773054.29380: _low_level_execute_command(): starting 9743 1726773054.29496: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9743 1726773054.32494: stdout chunk (state=2): >>>/root <<< 9743 1726773054.32761: stderr chunk (state=3): >>><<< 9743 1726773054.32773: stdout chunk (state=3): >>><<< 9743 1726773054.32799: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9743 1726773054.32814: _low_level_execute_command(): starting 9743 1726773054.32821: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773054.328082-9743-182808520018783 `" && echo ansible-tmp-1726773054.328082-9743-182808520018783="` echo /root/.ansible/tmp/ansible-tmp-1726773054.328082-9743-182808520018783 `" ) && sleep 0' 9743 1726773054.36358: stdout chunk (state=2): >>>ansible-tmp-1726773054.328082-9743-182808520018783=/root/.ansible/tmp/ansible-tmp-1726773054.328082-9743-182808520018783 <<< 9743 1726773054.36444: stderr chunk (state=3): >>><<< 9743 1726773054.36452: stdout chunk (state=3): >>><<< 9743 1726773054.36473: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773054.328082-9743-182808520018783=/root/.ansible/tmp/ansible-tmp-1726773054.328082-9743-182808520018783 , stderr= 9743 1726773054.36504: variable 'ansible_module_compression' from source: unknown 9743 1726773054.36559: ANSIBALLZ: Using generic lock for ansible.legacy.setup 9743 1726773054.36564: ANSIBALLZ: Acquiring lock 9743 1726773054.36570: ANSIBALLZ: Lock acquired: 139792132305312 9743 1726773054.36574: ANSIBALLZ: Creating module 9743 1726773054.62783: ANSIBALLZ: Writing module into payload 9743 1726773054.62901: ANSIBALLZ: Writing module 9743 1726773054.62925: ANSIBALLZ: Renaming module 9743 1726773054.62932: ANSIBALLZ: Done creating module 9743 1726773054.62960: variable 'ansible_facts' from source: unknown 9743 1726773054.62966: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9743 1726773054.62975: _low_level_execute_command(): starting 9743 1726773054.62982: _low_level_execute_command(): executing: 
/bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'python3.6'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'/usr/libexec/platform-python'"'"'; command -v '"'"'python2.7'"'"'; command -v '"'"'/usr/bin/python'"'"'; command -v '"'"'python'"'"'; echo ENDFOUND && sleep 0' 9743 1726773054.65321: stdout chunk (state=2): >>>PLATFORM <<< 9743 1726773054.65383: stdout chunk (state=3): >>>Linux <<< 9743 1726773054.65397: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 <<< 9743 1726773054.65419: stdout chunk (state=3): >>>/usr/bin/python3.6 /usr/bin/python3 /usr/libexec/platform-python ENDFOUND <<< 9743 1726773054.65572: stderr chunk (state=3): >>><<< 9743 1726773054.65588: stdout chunk (state=3): >>><<< 9743 1726773054.65599: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3.6 /usr/bin/python3 /usr/libexec/platform-python ENDFOUND , stderr= 9743 1726773054.65606 [managed_node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3.6', '/usr/bin/python3', '/usr/libexec/platform-python'] 9743 1726773054.65638: _low_level_execute_command(): starting 9743 1726773054.65645: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 9743 1726773054.65722: Sending initial data 9743 1726773054.65729: Sent initial data (1234 bytes) 9743 1726773054.69707: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"8\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"8\"\nPLATFORM_ID=\"platform:el8\"\nPRETTY_NAME=\"CentOS Stream 8\"\nANSI_COLOR=\"0;31\"\nCPE_NAME=\"cpe:/o:centos:centos:8\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 8\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 9743 1726773054.70174: stderr chunk (state=3): >>><<< 9743 1726773054.70182: stdout chunk (state=3): >>><<< 9743 1726773054.70198: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"8\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"8\"\nPLATFORM_ID=\"platform:el8\"\nPRETTY_NAME=\"CentOS Stream 8\"\nANSI_COLOR=\"0;31\"\nCPE_NAME=\"cpe:/o:centos:centos:8\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 8\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr= 9743 1726773054.70246: variable 'ansible_facts' from source: unknown 9743 1726773054.70252: variable 'ansible_facts' from source: unknown 9743 1726773054.70261: variable 'ansible_module_compression' from source: unknown 9743 1726773054.70299: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 9743 1726773054.70323: variable 'ansible_facts' from source: unknown 9743 1726773054.70470: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773054.328082-9743-182808520018783/AnsiballZ_setup.py 9743 1726773054.70580: Sending initial data 9743 1726773054.70589: Sent initial data (151 bytes) 9743 1726773054.73238: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmppvw021gx 
/root/.ansible/tmp/ansible-tmp-1726773054.328082-9743-182808520018783/AnsiballZ_setup.py <<< 9743 1726773054.75473: stderr chunk (state=3): >>><<< 9743 1726773054.75483: stdout chunk (state=3): >>><<< 9743 1726773054.75507: done transferring module to remote 9743 1726773054.75518: _low_level_execute_command(): starting 9743 1726773054.75524: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773054.328082-9743-182808520018783/ /root/.ansible/tmp/ansible-tmp-1726773054.328082-9743-182808520018783/AnsiballZ_setup.py && sleep 0' 9743 1726773054.78548: stderr chunk (state=2): >>><<< 9743 1726773054.78558: stdout chunk (state=2): >>><<< 9743 1726773054.78578: _low_level_execute_command() done: rc=0, stdout=, stderr= 9743 1726773054.78583: _low_level_execute_command(): starting 9743 1726773054.78591: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773054.328082-9743-182808520018783/AnsiballZ_setup.py && sleep 0' 9743 1726773055.33337: stdout chunk (state=2): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "4.18.0-553.5.1.el8.x86_64", "ansible_kernel_version": "#1 SMP Tue May 21 05:46:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.6.8", "ansible_fqdn": "ip-10-31-47-99.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-47-99", "ansible_nodename": "ip-10-31-47-99.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "08bd8a2f207a4b8ca89af5a29ce2e4fa", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Thursday", "weekday_number": "4", "weeknumber": "38", "day": "19", "hour": "15", "minute": "10", "second": "55", "epoch": "1726773055", "epoch_int": "1726773055", "date": "2024-09-19", "time": "15:10:55", "iso8601_micro": "2024-09-19T19:10:55.063413Z", "iso8601": "2024-09-19T19:10:55Z", "iso8601_basic": "20240919T151055063413", "iso8601_basic_short": "20240919T151055", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANq2lB8EQsesuM3VrW5PfnDEMyZj3K+FFHiJf3WGGIKsYitYJGRKmjyQ+t+FN8efSaWyATBrW2J9UQQu6foxmIbeQD+JUrANokQrzz7Sa7PkBjVEEeJuID0kF73Y+wTPKLeGyyGbF8ST7/mIirwMLgNF9rYjJSTlSR9s6ngk6a09AAAAFQDpx7XaB+Qw4TNJdgutuLdKJ<<< 9743 1726773055.33363: stdout chunk (state=3): >>>DnHBQAAAIEAilO02VLThxnGAc9LGF87asL3RLULaVv1vX/07fPx9WJ7lXgCnIe/2pOE58lypdix9BaiRgz0vvJcEGj7jRMTYpqrC4Xmmis6YlGAtU03ynSa96bqHTpYE2RQXQm1IhDS8UHCV50LWC6KkQium4FH88Kk0JFNiP+2D9oZgqxp3XcAAACBAKKbrhxDgjoiAGcG0Q6pr/zv5mEt39DpWF2V0oHuRo++LCEEo+1kRmNTQcvSOF5B6WX7ajelMNb5uEpe/ehylQ/0gRV4Hrqze8Lek31towkWtZpMM5h1eE3gsXao625iSUUU57IOM0ssRiarIQWz0PydCw02F1gc8RLJz1/tQaUd", 
"ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDwbGRzV+PLJzG8aSAcqOY8RTd9sTHhGDA/Nzd+VP1W1/P6NLxRuaQowVrHflI0lMtLCzYSRIWwO1UTRON0/nrIoejGiNd97mckGaK1FR3Ps9PXkYaIq9kT7bf4y2kEDiJ/eWALKqlLuY2lsiskmaPRkAqd7u7fIAU/1RNphI7syU1vQcbZZz1ygKXXg5/yn5kYvHX3p8zy6jZY1gofkNJFv3F2dFDojQbUWAL1X3Rj5XINn7lVsy4T1bRjKQbPVtAB482E73zFS5xFPXvW5/Olly46R30LQztq/sOSMmmNCgFEafBsy8xwYw0zY3pTYbPf2TGIeXO0KMwxYV6aqbAuyzBqHLUWoiUA6wyx/CnVFlGv5cdYwAu+LTM3eZv0nSlVFeJvVnzH2oyyIj4OZCq37QCKWn4to7agNegKwEGvcMdU3v1Pm0FdCJyKrxVVs78XVRgcHoIhCP2NBxqExcpsNhNtxCoiXfQ8gNlLk4g2ID9T6qXej5EKEbM5+UOnVVU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBELSvfJvqPJMYQwTTk5JfFg5rIIsSQutxxyNg6uPgYcegQfl4NFd7KUg0S8GYsFMKgA4UDc2kjRfzDhg5BMkN2Q=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOAyn0xVKMIeJv8cnuXIonl9H3rUCR6uyGL4mdOW4gcQ", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "8", "ansible_distribution_major_version": "8", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3539, "ansible_memfree_mb": 2699, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3539, "used": 840, "free": 2699}, "nocache": {"free": 3297, "used": 242}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ad592-3385-4c88-65df-904ee15ade02", "ansible_product_uuid": "ec2ad592-3385-4c88-65df-904ee15ade02", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "0", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["fe591198-9082-4b15-9b62-e83518524cd2"], 
"labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2", "holders": []}}, "rotati<<< 9743 1726773055.33398: stdout chunk (state=3): >>>onal": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["fe591198-9082-4b15-9b62-e83518524cd2"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 470, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_total": 268423901184, "size_available": 263514284032, "block_size": 4096, "block_total": 65533179, "block_available": 64334542, "block_used": 1198637, "inode_total": 131071472, "inode_available": 130994299, "inode_used": 77173, "uuid": "fe591198-9082-4b15-9b62-e83518524cd2"}], "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 6, "micro": 8, "releaselevel": "final", "serial": 0}, "version_info": [3, 6, 8, "final", 0], "executable": "/usr/libexec/platform-python", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_loadavg": {"1m": 0.61, "5m": 0.31, "15m": 0.15}, "ansible_hostnqn": "", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "rx_udp_gro_forwarding": "off", "rx_gro_list": "off", "tls_hw_rx_offload": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", 
"esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:b1:f0:5f:31:9b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.99", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::b1:f0ff:fe5f:319b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_gene<<< 9743 1726773055.33431: stdout chunk (state=3): >>>ric": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "rx_udp_gro_forwarding": "off", "rx_gro_list": "off", "tls_hw_rx_offload": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.99", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:b1:f0:5f:31:9b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.99"], "ansible_all_ipv6_addresses": ["fe80::b1:f0ff:fe5f:319b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.99", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::b1:f0ff:fe5f:319b"]}, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": 
{"LS_COLORS": "", "SSH_CONNECTION": "10.31.14.7 52438 10.31.47.99 22", "_": "/usr/libexec/platform-python", "LANG": "en_US.UTF-8", "which_declare": "declare -f", "XDG_SESSION_ID": "6", "USER": "root", "SELINUX_ROLE_REQUESTED": "", "PWD": "/root", "HOME": "/root", "SSH_CLIENT": "10.31.14.7 52438 22", "SELINUX_LEVEL_REQUESTED": "", "SSH_TTY": "/dev/pts/0", "SHELL": "/bin/bash", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "2", "LOGNAME": "root", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "XDG_RUNTIME_DIR": "/run/user/0", "PATH": "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 9743 1726773055.35032: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 9743 1726773055.35080: stderr chunk (state=3): >>><<< 9743 1726773055.35088: stdout chunk (state=3): >>><<< 9743 1726773055.35112: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "4.18.0-553.5.1.el8.x86_64", "ansible_kernel_version": "#1 SMP Tue May 21 05:46:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.6.8", "ansible_fqdn": "ip-10-31-47-99.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-47-99", "ansible_nodename": "ip-10-31-47-99.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "08bd8a2f207a4b8ca89af5a29ce2e4fa", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Thursday", "weekday_number": "4", "weeknumber": "38", "day": "19", "hour": "15", "minute": "10", "second": "55", "epoch": "1726773055", "epoch_int": "1726773055", "date": "2024-09-19", "time": "15:10:55", "iso8601_micro": "2024-09-19T19:10:55.063413Z", "iso8601": "2024-09-19T19:10:55Z", "iso8601_basic": "20240919T151055063413", "iso8601_basic_short": "20240919T151055", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANq2lB8EQsesuM3VrW5PfnDEMyZj3K+FFHiJf3WGGIKsYitYJGRKmjyQ+t+FN8efSaWyATBrW2J9UQQu6foxmIbeQD+JUrANokQrzz7Sa7PkBjVEEeJuID0kF73Y+wTPKLeGyyGbF8ST7/mIirwMLgNF9rYjJSTlSR9s6ngk6a09AAAAFQDpx7XaB+Qw4TNJdgutuLdKJDnHBQAAAIEAilO02VLThxnGAc9LGF87asL3RLULaVv1vX/07fPx9WJ7lXgCnIe/2pOE58lypdix9BaiRgz0vvJcEGj7jRMTYpqrC4Xmmis6YlGAtU03ynSa96bqHTpYE2RQXQm1IhDS8UHCV50LWC6KkQium4FH88Kk0JFNiP+2D9oZgqxp3XcAAACBAKKbrhxDgjoiAGcG0Q6pr/zv5mEt39DpWF2V0oHuRo++LCEEo+1kRmNTQcvSOF5B6WX7ajelMNb5uEpe/ehylQ/0gRV4Hrqze8Lek31towkWtZpMM5h1eE3gsXao625iSUUU57IOM0ssRiarIQWz0PydCw02F1gc8RLJz1/tQaUd", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDwbGRzV+PLJzG8aSAcqOY8RTd9sTHhGDA/Nzd+VP1W1/P6NLxRuaQowVrHflI0lMtLCzYSRIWwO1UTRON0/nrIoejGiNd97mckGaK1FR3Ps9PXkYaIq9kT7bf4y2kEDiJ/eWALKqlLuY2lsiskmaPRkAqd7u7fIAU/1RNphI7syU1vQcbZZz1ygKXXg5/yn5kYvHX3p8zy6jZY1gofkNJFv3F2dFDojQbUWAL1X3Rj5XINn7lVsy4T1bRjKQbPVtAB482E73zFS5xFPXvW5/Olly46R30LQztq/sOSMmmNCgFEafBsy8xwYw0zY3pTYbPf2TGIeXO0KMwxYV6aqbAuyzBqHLUWoiUA6wyx/CnVFlGv5cdYwAu+LTM3eZv0nSlVFeJvVnzH2oyyIj4OZCq37QCKWn4to7agNegKwEGvcMdU3v1Pm0FdCJyKrxVVs78XVRgcHoIhCP2NBxqExcpsNhNtxCoiXfQ8gNlLk4g2ID9T6qXej5EKEbM5+UOnVVU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBELSvfJvqPJMYQwTTk5JfFg5rIIsSQutxxyNg6uPgYcegQfl4NFd7KUg0S8GYsFMKgA4UDc2kjRfzDhg5BMkN2Q=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOAyn0xVKMIeJv8cnuXIonl9H3rUCR6uyGL4mdOW4gcQ", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "8", "ansible_distribution_major_version": "8", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3539, "ansible_memfree_mb": 2699, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3539, "used": 840, "free": 2699}, "nocache": {"free": 3297, "used": 242}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", 
"ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ad592-3385-4c88-65df-904ee15ade02", "ansible_product_uuid": "ec2ad592-3385-4c88-65df-904ee15ade02", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "0", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["fe591198-9082-4b15-9b62-e83518524cd2"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["fe591198-9082-4b15-9b62-e83518524cd2"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 470, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_total": 268423901184, "size_available": 263514284032, "block_size": 4096, "block_total": 65533179, "block_available": 64334542, "block_used": 1198637, "inode_total": 131071472, "inode_available": 130994299, "inode_used": 77173, "uuid": "fe591198-9082-4b15-9b62-e83518524cd2"}], "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 6, "micro": 8, "releaselevel": "final", "serial": 0}, "version_info": [3, 6, 8, "final", 0], "executable": "/usr/libexec/platform-python", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_loadavg": {"1m": 0.61, "5m": 0.31, "15m": 0.15}, "ansible_hostnqn": "", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off 
[fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "rx_udp_gro_forwarding": "off", "rx_gro_list": "off", "tls_hw_rx_offload": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:b1:f0:5f:31:9b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.99", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::b1:f0ff:fe5f:319b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "rx_udp_gro_forwarding": "off", "rx_gro_list": "off", "tls_hw_rx_offload": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.99", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": 
"02:b1:f0:5f:31:9b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.99"], "ansible_all_ipv6_addresses": ["fe80::b1:f0ff:fe5f:319b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.99", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::b1:f0ff:fe5f:319b"]}, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"LS_COLORS": "", "SSH_CONNECTION": "10.31.14.7 52438 10.31.47.99 22", "_": "/usr/libexec/platform-python", "LANG": "en_US.UTF-8", "which_declare": "declare -f", "XDG_SESSION_ID": "6", "USER": "root", "SELINUX_ROLE_REQUESTED": "", "PWD": "/root", "HOME": "/root", "SSH_CLIENT": "10.31.14.7 52438 22", "SELINUX_LEVEL_REQUESTED": "", "SSH_TTY": "/dev/pts/0", "SHELL": "/bin/bash", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "2", "LOGNAME": "root", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "XDG_RUNTIME_DIR": "/run/user/0", "PATH": "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=Shared connection to 10.31.47.99 closed. 9743 1726773055.36003: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773054.328082-9743-182808520018783/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9743 1726773055.36022: _low_level_execute_command(): starting 9743 1726773055.36029: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773054.328082-9743-182808520018783/ > /dev/null 2>&1 && sleep 0' 9743 1726773055.38515: stderr chunk (state=2): >>><<< 9743 1726773055.38534: stdout chunk (state=3): >>><<< 9743 1726773055.38546: _low_level_execute_command() done: rc=0, stdout=, stderr= 9743 1726773055.38556: handler run complete 9743 1726773055.38627: variable 'ansible_facts' from source: unknown 9743 1726773055.38695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9743 1726773055.38866: variable 'ansible_facts' from source: unknown 9743 1726773055.38933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9743 1726773055.39013: attempt loop complete, returning result 9743 1726773055.39019: _execute() done 9743 1726773055.39024: dumping result to json 9743 1726773055.39041: done dumping result, returning 9743 1726773055.39048: done running TaskExecutor() for managed_node3/TASK: Gathering Facts 
[0affffe7-6841-7dd6-8fa6-000000000014] 9743 1726773055.39054: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000014 9743 1726773055.39171: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000014 9743 1726773055.39175: WORKER PROCESS EXITING ok: [managed_node3] 9733 1726773055.41294: no more pending results, returning what we have 9733 1726773055.41296: results queue empty 9733 1726773055.41297: checking for any_errors_fatal 9733 1726773055.41298: done checking for any_errors_fatal 9733 1726773055.41298: checking for max_fail_percentage 9733 1726773055.41299: done checking for max_fail_percentage 9733 1726773055.41299: checking to see if all hosts have failed and the running result is not ok 9733 1726773055.41299: done checking to see if all hosts have failed 9733 1726773055.41300: getting the remaining hosts for this loop 9733 1726773055.41301: done getting the remaining hosts for this loop 9733 1726773055.41303: getting the next task for host managed_node3 9733 1726773055.41306: done getting next task for host managed_node3 9733 1726773055.41307: ^ task is: TASK: meta (flush_handlers) 9733 1726773055.41308: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773055.41311: getting variables 9733 1726773055.41311: in VariableManager get_vars() 9733 1726773055.41323: Calling all_inventory to load vars for managed_node3 9733 1726773055.41325: Calling groups_inventory to load vars for managed_node3 9733 1726773055.41327: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773055.41332: Calling all_plugins_play to load vars for managed_node3 9733 1726773055.41333: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773055.41335: Calling groups_plugins_play to load vars for managed_node3 9733 1726773055.41436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773055.41544: done with get_vars() 9733 1726773055.41551: done getting variables 9733 1726773055.41583: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 9733 1726773055.41620: in VariableManager get_vars() 9733 1726773055.41626: Calling all_inventory to load vars for managed_node3 9733 1726773055.41628: Calling groups_inventory to load vars for managed_node3 9733 1726773055.41629: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773055.41632: Calling all_plugins_play to load vars for managed_node3 9733 1726773055.41633: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773055.41634: Calling groups_plugins_play to load vars for managed_node3 9733 1726773055.41715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773055.41831: done with get_vars() 9733 1726773055.41841: done queuing things up, now waiting for results queue to drain 9733 1726773055.41842: results queue empty 9733 1726773055.41843: checking for any_errors_fatal 9733 1726773055.41845: done checking for any_errors_fatal 9733 1726773055.41845: checking for max_fail_percentage 9733 1726773055.41846: done checking for max_fail_percentage 9733 1726773055.41846: checking to see if all hosts have failed and the running result is not ok 9733 1726773055.41846: done 
checking to see if all hosts have failed 9733 1726773055.41847: getting the remaining hosts for this loop 9733 1726773055.41847: done getting the remaining hosts for this loop 9733 1726773055.41849: getting the next task for host managed_node3 9733 1726773055.41851: done getting next task for host managed_node3 9733 1726773055.41852: ^ task is: TASK: Set platform independent vars used by this test 9733 1726773055.41853: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773055.41855: getting variables 9733 1726773055.41856: in VariableManager get_vars() 9733 1726773055.41861: Calling all_inventory to load vars for managed_node3 9733 1726773055.41862: Calling groups_inventory to load vars for managed_node3 9733 1726773055.41863: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773055.41869: Calling all_plugins_play to load vars for managed_node3 9733 1726773055.41871: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773055.41873: Calling groups_plugins_play to load vars for managed_node3 9733 1726773055.41948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773055.42047: done with get_vars() 9733 1726773055.42054: done getting variables 9733 1726773055.42117: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set platform independent vars used by this test] ************************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_simple_settings.yml:10 Thursday 19 September 2024 15:10:55 -0400 (0:00:01.141) 0:00:01.153 **** 9733 1726773055.42138: entering _queue_task() for managed_node3/include_vars 9733 1726773055.42139: Creating lock for include_vars 9733 1726773055.42325: worker is 1 (out of 1 available) 9733 1726773055.42336: exiting _queue_task() for managed_node3/include_vars 9733 1726773055.42346: done queuing things up, now waiting for results queue to drain 9733 1726773055.42348: waiting for pending results... 
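
The task banner just above announces an include_vars action that loads the variables this test feeds to the role. As a rough, hypothetical sketch of a task of this shape (the actual task at tests_simple_settings.yml:10 is not shown in this log and may be written differently, for example with a loop over candidate files):

    - name: Set platform independent vars used by this test
      ansible.builtin.include_vars:
        file: vars/vars_simple_settings.yml   # hypothetical; the file actually resolved is reported in the task result below
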
9782 1726773055.42499: running TaskExecutor() for managed_node3/TASK: Set platform independent vars used by this test 9782 1726773055.42599: in run() - task 0affffe7-6841-7dd6-8fa6-000000000006 9782 1726773055.42616: variable 'ansible_search_path' from source: unknown 9782 1726773055.42645: calling self._execute() 9782 1726773055.42704: variable 'ansible_host' from source: host vars for 'managed_node3' 9782 1726773055.42714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9782 1726773055.42724: variable 'omit' from source: magic vars 9782 1726773055.42798: variable 'omit' from source: magic vars 9782 1726773055.42822: variable 'omit' from source: magic vars 9782 1726773055.42847: variable 'omit' from source: magic vars 9782 1726773055.42882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9782 1726773055.42911: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9782 1726773055.42930: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9782 1726773055.42946: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9782 1726773055.42957: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9782 1726773055.42980: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9782 1726773055.42984: variable 'ansible_host' from source: host vars for 'managed_node3' 9782 1726773055.42990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9782 1726773055.43056: Set connection var ansible_timeout to 10 9782 1726773055.43062: Set connection var ansible_shell_type to sh 9782 1726773055.43066: Set connection var ansible_module_compression to ZIP_DEFLATED 9782 1726773055.43071: Set connection var ansible_shell_executable to /bin/sh 9782 1726773055.43075: Set connection var ansible_pipelining to False 9782 1726773055.43078: Set connection var ansible_connection to ssh 9782 1726773055.43093: variable 'ansible_shell_executable' from source: unknown 9782 1726773055.43096: variable 'ansible_connection' from source: unknown 9782 1726773055.43097: variable 'ansible_module_compression' from source: unknown 9782 1726773055.43099: variable 'ansible_shell_type' from source: unknown 9782 1726773055.43101: variable 'ansible_shell_executable' from source: unknown 9782 1726773055.43103: variable 'ansible_host' from source: host vars for 'managed_node3' 9782 1726773055.43105: variable 'ansible_pipelining' from source: unknown 9782 1726773055.43106: variable 'ansible_timeout' from source: unknown 9782 1726773055.43108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9782 1726773055.43214: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9782 1726773055.43226: variable 'omit' from source: magic vars 9782 1726773055.43233: starting attempt loop 9782 1726773055.43237: running the handler 9782 1726773055.43246: evaluation_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings 9782 1726773055.43266: search_path: 
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/vars/vars/vars_simple_settings.yml /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/vars/vars_simple_settings.yml /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/vars/vars/vars_simple_settings.yml /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/vars/vars_simple_settings.yml 9782 1726773055.43424: handler run complete 9782 1726773055.43435: attempt loop complete, returning result 9782 1726773055.43439: _execute() done 9782 1726773055.43443: dumping result to json 9782 1726773055.43448: done dumping result, returning 9782 1726773055.43454: done running TaskExecutor() for managed_node3/TASK: Set platform independent vars used by this test [0affffe7-6841-7dd6-8fa6-000000000006] 9782 1726773055.43458: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000006 9782 1726773055.43484: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000006 9782 1726773055.43547: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_profile_file": "# File managed by Ansible - DO NOT EDIT\n[main]\nsummary = kernel settings\n[sysctl]\nfs.epoll.max_user_watches = 785592\nfs.file-max = 379724\n[vm]\ntransparent_hugepages = madvise\n[sysfs]\n/sys/kernel/debug/x86/ibrs_enabled = 0\n/sys/kernel/debug/x86/pti_enabled = 0\n/sys/kernel/debug/x86/retp_enabled = 0\n{% if __kernel_settings_blcmdline_value | d() %}\n[bootloader]\ncmdline = {{ __kernel_settings_blcmdline_value }}\n{% endif %}\n", "kernel_settings_sysctl": [ { "name": "fs.epoll.max_user_watches", "value": 785592 }, { "name": "fs.file-max", "value": 379724 }, { "name": "no.such.param", "state": "absent" } ], "kernel_settings_sysfs": [ { "name": "/sys/kernel/debug/x86/pti_enabled", "value": 0 }, { "name": "/sys/kernel/debug/x86/retp_enabled", "value": 0 }, { "name": "/sys/kernel/debug/x86/ibrs_enabled", "value": 0 }, { "name": "/sys/not/found", "state": "absent" } ], "kernel_settings_systemd_cpu_affinity": { "state": "absent" }, "kernel_settings_transparent_hugepages": "madvise", "kernel_settings_transparent_hugepages_defrag": { "state": "absent" } }, "ansible_included_var_files": [ "/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/vars/vars_simple_settings.yml" ], "changed": false } 9733 1726773055.43701: no more pending results, returning what we have 9733 1726773055.43703: results queue empty 9733 1726773055.43704: checking for any_errors_fatal 9733 1726773055.43705: done checking for any_errors_fatal 9733 1726773055.43705: checking for max_fail_percentage 9733 1726773055.43706: done checking for max_fail_percentage 9733 1726773055.43707: checking to see if all hosts have failed and the running result is not ok 9733 1726773055.43707: done checking to see if all hosts have failed 9733 1726773055.43708: getting the remaining hosts for this loop 9733 1726773055.43709: done getting the remaining hosts for this loop 9733 1726773055.43711: getting the next task for host managed_node3 9733 1726773055.43714: done getting next task for host managed_node3 9733 1726773055.43716: ^ task is: TASK: Disable bootloader cmdline testing on Fedora 9733 1726773055.43718: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 9733 1726773055.43720: getting variables 9733 1726773055.43721: in VariableManager get_vars() 9733 1726773055.43742: Calling all_inventory to load vars for managed_node3 9733 1726773055.43744: Calling groups_inventory to load vars for managed_node3 9733 1726773055.43747: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773055.43754: Calling all_plugins_play to load vars for managed_node3 9733 1726773055.43756: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773055.43759: Calling groups_plugins_play to load vars for managed_node3 9733 1726773055.43849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773055.43954: done with get_vars() 9733 1726773055.43960: done getting variables 9733 1726773055.44024: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Disable bootloader cmdline testing on Fedora] **************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_simple_settings.yml:14 Thursday 19 September 2024 15:10:55 -0400 (0:00:00.019) 0:00:01.172 **** 9733 1726773055.44041: entering _queue_task() for managed_node3/set_fact 9733 1726773055.44042: Creating lock for set_fact 9733 1726773055.44202: worker is 1 (out of 1 available) 9733 1726773055.44215: exiting _queue_task() for managed_node3/set_fact 9733 1726773055.44226: done queuing things up, now waiting for results queue to drain 9733 1726773055.44228: waiting for pending results... 
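
The ok: result above is essentially the contents of vars_simple_settings.yml turned into ansible_facts. Rewritten as YAML purely from the values printed in that result (the real vars file also defines __kernel_settings_profile_file, omitted here for brevity, and may differ in key order or comments):

    kernel_settings_sysctl:
      - name: fs.epoll.max_user_watches
        value: 785592
      - name: fs.file-max
        value: 379724
      - name: no.such.param
        state: absent
    kernel_settings_sysfs:
      - name: /sys/kernel/debug/x86/pti_enabled
        value: 0
      - name: /sys/kernel/debug/x86/retp_enabled
        value: 0
      - name: /sys/kernel/debug/x86/ibrs_enabled
        value: 0
      - name: /sys/not/found
        state: absent
    kernel_settings_systemd_cpu_affinity:
      state: absent
    kernel_settings_transparent_hugepages: madvise
    kernel_settings_transparent_hugepages_defrag:
      state: absent
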
9783 1726773055.44316: running TaskExecutor() for managed_node3/TASK: Disable bootloader cmdline testing on Fedora 9783 1726773055.44404: in run() - task 0affffe7-6841-7dd6-8fa6-000000000007 9783 1726773055.44419: variable 'ansible_search_path' from source: unknown 9783 1726773055.44446: calling self._execute() 9783 1726773055.44498: variable 'ansible_host' from source: host vars for 'managed_node3' 9783 1726773055.44505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9783 1726773055.44512: variable 'omit' from source: magic vars 9783 1726773055.44832: variable 'ansible_distribution' from source: facts 9783 1726773055.44851: Evaluated conditional (ansible_distribution == "Fedora"): False 9783 1726773055.44856: when evaluation is False, skipping this task 9783 1726773055.44859: _execute() done 9783 1726773055.44863: dumping result to json 9783 1726773055.44867: done dumping result, returning 9783 1726773055.44875: done running TaskExecutor() for managed_node3/TASK: Disable bootloader cmdline testing on Fedora [0affffe7-6841-7dd6-8fa6-000000000007] 9783 1726773055.44882: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000007 9783 1726773055.44908: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000007 9783 1726773055.44911: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution == \"Fedora\"", "skip_reason": "Conditional result was False" } 9733 1726773055.45031: no more pending results, returning what we have 9733 1726773055.45034: results queue empty 9733 1726773055.45035: checking for any_errors_fatal 9733 1726773055.45038: done checking for any_errors_fatal 9733 1726773055.45039: checking for max_fail_percentage 9733 1726773055.45040: done checking for max_fail_percentage 9733 1726773055.45040: checking to see if all hosts have failed and the running result is not ok 9733 1726773055.45041: done checking to see if all hosts have failed 9733 1726773055.45041: getting the remaining hosts for this loop 9733 1726773055.45043: done getting the remaining hosts for this loop 9733 1726773055.45045: getting the next task for host managed_node3 9733 1726773055.45050: done getting next task for host managed_node3 9733 1726773055.45052: ^ task is: TASK: Apply the settings - call the role 9733 1726773055.45054: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773055.45057: getting variables 9733 1726773055.45058: in VariableManager get_vars() 9733 1726773055.45082: Calling all_inventory to load vars for managed_node3 9733 1726773055.45084: Calling groups_inventory to load vars for managed_node3 9733 1726773055.45087: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773055.45095: Calling all_plugins_play to load vars for managed_node3 9733 1726773055.45097: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773055.45099: Calling groups_plugins_play to load vars for managed_node3 9733 1726773055.45196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773055.45323: done with get_vars() 9733 1726773055.45330: done getting variables TASK [Apply the settings - call the role] ************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_simple_settings.yml:25 Thursday 19 September 2024 15:10:55 -0400 (0:00:00.013) 0:00:01.186 **** 9733 1726773055.45389: entering _queue_task() for managed_node3/include_role 9733 1726773055.45391: Creating lock for include_role 9733 1726773055.45549: worker is 1 (out of 1 available) 9733 1726773055.45562: exiting _queue_task() for managed_node3/include_role 9733 1726773055.45573: done queuing things up, now waiting for results queue to drain 9733 1726773055.45575: waiting for pending results... 9784 1726773055.45660: running TaskExecutor() for managed_node3/TASK: Apply the settings - call the role 9784 1726773055.45753: in run() - task 0affffe7-6841-7dd6-8fa6-000000000008 9784 1726773055.45771: variable 'ansible_search_path' from source: unknown 9784 1726773055.45802: calling self._execute() 9784 1726773055.45852: variable 'ansible_host' from source: host vars for 'managed_node3' 9784 1726773055.45860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9784 1726773055.45870: variable 'omit' from source: magic vars 9784 1726773055.45940: _execute() done 9784 1726773055.45945: dumping result to json 9784 1726773055.45949: done dumping result, returning 9784 1726773055.45955: done running TaskExecutor() for managed_node3/TASK: Apply the settings - call the role [0affffe7-6841-7dd6-8fa6-000000000008] 9784 1726773055.45963: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000008 9784 1726773055.45990: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000008 9784 1726773055.45994: WORKER PROCESS EXITING 9733 1726773055.46086: no more pending results, returning what we have 9733 1726773055.46091: in VariableManager get_vars() 9733 1726773055.46116: Calling all_inventory to load vars for managed_node3 9733 1726773055.46119: Calling groups_inventory to load vars for managed_node3 9733 1726773055.46121: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773055.46129: Calling all_plugins_play to load vars for managed_node3 9733 1726773055.46131: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773055.46134: Calling groups_plugins_play to load vars for managed_node3 9733 1726773055.46233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773055.46334: done with get_vars() 9733 1726773055.46341: variable 'ansible_search_path' from source: unknown 9733 1726773055.46487: variable 'omit' from source: magic vars 9733 1726773055.46502: variable 'omit' from source: magic vars 
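
The skip reported above ("Disable bootloader cmdline testing on Fedora") shows a common test pattern: a set_fact guarded by a distribution check, which this host does not satisfy. A hypothetical sketch of a task of that shape, where only the name and the when clause come from the log and the fact being set is invented for illustration:

    - name: Disable bootloader cmdline testing on Fedora
      ansible.builtin.set_fact:
        __test_bootloader_cmdline: false   # invented fact name, for illustration only
      when: ansible_distribution == "Fedora"
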
9733 1726773055.46512: variable 'omit' from source: magic vars 9733 1726773055.46516: we have included files to process 9733 1726773055.46517: generating all_blocks data 9733 1726773055.46518: done generating all_blocks data 9733 1726773055.46518: processing included file: fedora.linux_system_roles.kernel_settings 9733 1726773055.46532: in VariableManager get_vars() 9733 1726773055.46539: done with get_vars() 9733 1726773055.46583: in VariableManager get_vars() 9733 1726773055.46594: done with get_vars() 9733 1726773055.46620: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml 9733 1726773055.46724: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml 9733 1726773055.46764: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml 9733 1726773055.46845: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml 9733 1726773055.47318: in VariableManager get_vars() 9733 1726773055.47330: done with get_vars() 9733 1726773055.48231: in VariableManager get_vars() 9733 1726773055.48243: done with get_vars() 9733 1726773055.48349: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml 9733 1726773055.48760: iterating over new_blocks loaded from include file 9733 1726773055.48762: in VariableManager get_vars() 9733 1726773055.48772: done with get_vars() 9733 1726773055.48773: filtering new block on tags 9733 1726773055.48799: done filtering new block on tags 9733 1726773055.48801: in VariableManager get_vars() 9733 1726773055.48812: done with get_vars() 9733 1726773055.48813: filtering new block on tags 9733 1726773055.48836: done filtering new block on tags 9733 1726773055.48837: in VariableManager get_vars() 9733 1726773055.48845: done with get_vars() 9733 1726773055.48846: filtering new block on tags 9733 1726773055.48934: done filtering new block on tags 9733 1726773055.48936: in VariableManager get_vars() 9733 1726773055.48945: done with get_vars() 9733 1726773055.48946: filtering new block on tags 9733 1726773055.48956: done filtering new block on tags 9733 1726773055.48957: done iterating over new_blocks loaded from include file 9733 1726773055.48957: extending task lists for all hosts with included blocks 9733 1726773055.49078: done extending task lists 9733 1726773055.49079: done processing included files 9733 1726773055.49080: results queue empty 9733 1726773055.49080: checking for any_errors_fatal 9733 1726773055.49082: done checking for any_errors_fatal 9733 1726773055.49083: checking for max_fail_percentage 9733 1726773055.49083: done checking for max_fail_percentage 9733 1726773055.49084: checking to see if all hosts have failed and the running result is not ok 9733 1726773055.49084: done checking to see if all hosts have failed 9733 1726773055.49086: getting the remaining hosts for this loop 9733 1726773055.49087: done getting the remaining hosts for this loop 9733 1726773055.49088: getting the next task for host managed_node3 9733 1726773055.49091: done getting next task for host managed_node3 9733 1726773055.49092: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 9733 1726773055.49094: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773055.49100: getting variables 9733 1726773055.49100: in VariableManager get_vars() 9733 1726773055.49109: Calling all_inventory to load vars for managed_node3 9733 1726773055.49110: Calling groups_inventory to load vars for managed_node3 9733 1726773055.49111: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773055.49115: Calling all_plugins_play to load vars for managed_node3 9733 1726773055.49116: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773055.49118: Calling groups_plugins_play to load vars for managed_node3 9733 1726773055.49216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773055.49322: done with get_vars() 9733 1726773055.49328: done getting variables 9733 1726773055.49376: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2 Thursday 19 September 2024 15:10:55 -0400 (0:00:00.040) 0:00:01.226 **** 9733 1726773055.49398: entering _queue_task() for managed_node3/fail 9733 1726773055.49400: Creating lock for fail 9733 1726773055.49604: worker is 1 (out of 1 available) 9733 1726773055.49618: exiting _queue_task() for managed_node3/fail 9733 1726773055.49629: done queuing things up, now waiting for results queue to drain 9733 1726773055.49632: waiting for pending results... 
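
Before the role's first task runs, note how the role was brought in: the "Apply the settings - call the role" task above is an include_role of fedora.linux_system_roles.kernel_settings, which loaded the role's vars, defaults, meta, tasks and handlers files listed in the log. A minimal sketch of such a call, assuming the kernel_settings_* variables set earlier are already in host facts (the real test play may wrap this in a block or pass variables explicitly):

    - name: Apply the settings - call the role
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.kernel_settings
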
9785 1726773055.49736: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 9785 1726773055.49842: in run() - task 0affffe7-6841-7dd6-8fa6-000000000047 9785 1726773055.49857: variable 'ansible_search_path' from source: unknown 9785 1726773055.49862: variable 'ansible_search_path' from source: unknown 9785 1726773055.49894: calling self._execute() 9785 1726773055.49951: variable 'ansible_host' from source: host vars for 'managed_node3' 9785 1726773055.49960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9785 1726773055.49970: variable 'omit' from source: magic vars 9785 1726773055.50309: variable 'kernel_settings_sysctl' from source: include_vars 9785 1726773055.50325: variable '__kernel_settings_state_empty' from source: role '' all vars 9785 1726773055.50334: Evaluated conditional (kernel_settings_sysctl != __kernel_settings_state_empty): True 9785 1726773055.50536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9785 1726773055.52218: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9785 1726773055.52265: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9785 1726773055.52297: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9785 1726773055.52324: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9785 1726773055.52345: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9785 1726773055.52403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9785 1726773055.52423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9785 1726773055.52439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9785 1726773055.52464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9785 1726773055.52473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9785 1726773055.52510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9785 1726773055.52527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9785 1726773055.52541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 
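
The filter-plugin loading above belongs to the role's boolean-value guard: a fail task that only fires when some sysctl entry carries a raw YAML boolean. A sketch built from the two conditionals this worker evaluates (the verbatim expressions appear in the skip result below; the failure message here is invented):

    - name: Check sysctl settings for boolean values
      ansible.builtin.fail:
        msg: kernel_settings_sysctl values must not be YAML booleans   # invented wording
      when:
        - kernel_settings_sysctl != __kernel_settings_state_empty
        - >-
          (kernel_settings_sysctl | selectattr("value", "defined")
           | selectattr("value", "sameas", true) | list | length > 0)
          or
          (kernel_settings_sysctl | selectattr("value", "defined")
           | selectattr("value", "sameas", false) | list | length > 0)
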
9785 1726773055.52564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9785 1726773055.52573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9785 1726773055.52617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9785 1726773055.52636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9785 1726773055.52654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9785 1726773055.52681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9785 1726773055.52694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9785 1726773055.52872: variable 'kernel_settings_sysctl' from source: include_vars 9785 1726773055.52926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9785 1726773055.53045: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9785 1726773055.53074: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9785 1726773055.53100: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9785 1726773055.53123: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9785 1726773055.53152: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9785 1726773055.53171: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9785 1726773055.53192: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9785 1726773055.53211: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 9785 1726773055.53240: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9785 1726773055.53252: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, 
class_only=False) 9785 1726773055.53265: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9785 1726773055.53281: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 9785 1726773055.53304: Evaluated conditional ((kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", true) | list | length > 0) or (kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", false) | list | length > 0)): False 9785 1726773055.53308: when evaluation is False, skipping this task 9785 1726773055.53310: _execute() done 9785 1726773055.53312: dumping result to json 9785 1726773055.53314: done dumping result, returning 9785 1726773055.53319: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [0affffe7-6841-7dd6-8fa6-000000000047] 9785 1726773055.53323: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000047 9785 1726773055.53343: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000047 9785 1726773055.53345: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "(kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", true) | list | length > 0) or (kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", false) | list | length > 0)", "skip_reason": "Conditional result was False" } 9733 1726773055.53630: no more pending results, returning what we have 9733 1726773055.53632: results queue empty 9733 1726773055.53633: checking for any_errors_fatal 9733 1726773055.53634: done checking for any_errors_fatal 9733 1726773055.53635: checking for max_fail_percentage 9733 1726773055.53636: done checking for max_fail_percentage 9733 1726773055.53636: checking to see if all hosts have failed and the running result is not ok 9733 1726773055.53636: done checking to see if all hosts have failed 9733 1726773055.53637: getting the remaining hosts for this loop 9733 1726773055.53637: done getting the remaining hosts for this loop 9733 1726773055.53640: getting the next task for host managed_node3 9733 1726773055.53644: done getting next task for host managed_node3 9733 1726773055.53647: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 9733 1726773055.53648: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773055.53657: getting variables 9733 1726773055.53658: in VariableManager get_vars() 9733 1726773055.53683: Calling all_inventory to load vars for managed_node3 9733 1726773055.53687: Calling groups_inventory to load vars for managed_node3 9733 1726773055.53688: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773055.53695: Calling all_plugins_play to load vars for managed_node3 9733 1726773055.53696: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773055.53698: Calling groups_plugins_play to load vars for managed_node3 9733 1726773055.53804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773055.53917: done with get_vars() 9733 1726773055.53926: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Set version specific variables] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:9 Thursday 19 September 2024 15:10:55 -0400 (0:00:00.045) 0:00:01.272 **** 9733 1726773055.53993: entering _queue_task() for managed_node3/include_tasks 9733 1726773055.53994: Creating lock for include_tasks 9733 1726773055.54175: worker is 1 (out of 1 available) 9733 1726773055.54191: exiting _queue_task() for managed_node3/include_tasks 9733 1726773055.54201: done queuing things up, now waiting for results queue to drain 9733 1726773055.54203: waiting for pending results... 9786 1726773055.54301: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 9786 1726773055.54406: in run() - task 0affffe7-6841-7dd6-8fa6-000000000048 9786 1726773055.54423: variable 'ansible_search_path' from source: unknown 9786 1726773055.54428: variable 'ansible_search_path' from source: unknown 9786 1726773055.54455: calling self._execute() 9786 1726773055.54516: variable 'ansible_host' from source: host vars for 'managed_node3' 9786 1726773055.54525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9786 1726773055.54535: variable 'omit' from source: magic vars 9786 1726773055.54607: _execute() done 9786 1726773055.54612: dumping result to json 9786 1726773055.54617: done dumping result, returning 9786 1726773055.54623: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables [0affffe7-6841-7dd6-8fa6-000000000048] 9786 1726773055.54631: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000048 9786 1726773055.54655: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000048 9786 1726773055.54659: WORKER PROCESS EXITING 9733 1726773055.54780: no more pending results, returning what we have 9733 1726773055.54786: in VariableManager get_vars() 9733 1726773055.54818: Calling all_inventory to load vars for managed_node3 9733 1726773055.54821: Calling groups_inventory to load vars for managed_node3 9733 1726773055.54823: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773055.54831: Calling all_plugins_play to load vars for managed_node3 9733 1726773055.54833: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773055.54838: Calling groups_plugins_play to load vars for managed_node3 9733 1726773055.54971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773055.55081: done with get_vars() 9733 1726773055.55089: variable 
'ansible_search_path' from source: unknown 9733 1726773055.55090: variable 'ansible_search_path' from source: unknown 9733 1726773055.55113: we have included files to process 9733 1726773055.55114: generating all_blocks data 9733 1726773055.55115: done generating all_blocks data 9733 1726773055.55119: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 9733 1726773055.55120: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 9733 1726773055.55121: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml for managed_node3 9733 1726773055.55649: done processing included file 9733 1726773055.55651: iterating over new_blocks loaded from include file 9733 1726773055.55652: in VariableManager get_vars() 9733 1726773055.55665: done with get_vars() 9733 1726773055.55666: filtering new block on tags 9733 1726773055.55687: done filtering new block on tags 9733 1726773055.55689: in VariableManager get_vars() 9733 1726773055.55702: done with get_vars() 9733 1726773055.55703: filtering new block on tags 9733 1726773055.55727: done filtering new block on tags 9733 1726773055.55729: in VariableManager get_vars() 9733 1726773055.55740: done with get_vars() 9733 1726773055.55742: filtering new block on tags 9733 1726773055.55898: done filtering new block on tags 9733 1726773055.55900: in VariableManager get_vars() 9733 1726773055.55913: done with get_vars() 9733 1726773055.55914: filtering new block on tags 9733 1726773055.55930: done filtering new block on tags 9733 1726773055.55932: done iterating over new_blocks loaded from include file 9733 1726773055.55932: extending task lists for all hosts with included blocks 9733 1726773055.56021: done extending task lists 9733 1726773055.56022: done processing included files 9733 1726773055.56022: results queue empty 9733 1726773055.56022: checking for any_errors_fatal 9733 1726773055.56024: done checking for any_errors_fatal 9733 1726773055.56024: checking for max_fail_percentage 9733 1726773055.56025: done checking for max_fail_percentage 9733 1726773055.56025: checking to see if all hosts have failed and the running result is not ok 9733 1726773055.56026: done checking to see if all hosts have failed 9733 1726773055.56026: getting the remaining hosts for this loop 9733 1726773055.56027: done getting the remaining hosts for this loop 9733 1726773055.56028: getting the next task for host managed_node3 9733 1726773055.56031: done getting next task for host managed_node3 9733 1726773055.56033: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 9733 1726773055.56035: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773055.56042: getting variables 9733 1726773055.56042: in VariableManager get_vars() 9733 1726773055.56050: Calling all_inventory to load vars for managed_node3 9733 1726773055.56051: Calling groups_inventory to load vars for managed_node3 9733 1726773055.56052: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773055.56056: Calling all_plugins_play to load vars for managed_node3 9733 1726773055.56057: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773055.56059: Calling groups_plugins_play to load vars for managed_node3 9733 1726773055.56135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773055.56242: done with get_vars() 9733 1726773055.56248: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2 Thursday 19 September 2024 15:10:55 -0400 (0:00:00.023) 0:00:01.295 **** 9733 1726773055.56299: entering _queue_task() for managed_node3/setup 9733 1726773055.56487: worker is 1 (out of 1 available) 9733 1726773055.56503: exiting _queue_task() for managed_node3/setup 9733 1726773055.56514: done queuing things up, now waiting for results queue to drain 9733 1726773055.56517: waiting for pending results... 9787 1726773055.56616: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 9787 1726773055.56740: in run() - task 0affffe7-6841-7dd6-8fa6-0000000000bb 9787 1726773055.56756: variable 'ansible_search_path' from source: unknown 9787 1726773055.56760: variable 'ansible_search_path' from source: unknown 9787 1726773055.56789: calling self._execute() 9787 1726773055.56849: variable 'ansible_host' from source: host vars for 'managed_node3' 9787 1726773055.56858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9787 1726773055.56866: variable 'omit' from source: magic vars 9787 1726773055.57266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9787 1726773055.58789: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9787 1726773055.58844: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9787 1726773055.58874: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9787 1726773055.58904: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9787 1726773055.58924: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9787 1726773055.58980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9787 1726773055.59006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 9787 1726773055.59025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9787 1726773055.59052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9787 1726773055.59063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9787 1726773055.59105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9787 1726773055.59124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9787 1726773055.59142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9787 1726773055.59167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9787 1726773055.59179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9787 1726773055.59301: variable '__kernel_settings_required_facts' from source: role '' all vars 9787 1726773055.59312: variable 'ansible_facts' from source: unknown 9787 1726773055.59367: Evaluated conditional (__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 9787 1726773055.59372: when evaluation is False, skipping this task 9787 1726773055.59374: _execute() done 9787 1726773055.59377: dumping result to json 9787 1726773055.59378: done dumping result, returning 9787 1726773055.59383: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role [0affffe7-6841-7dd6-8fa6-0000000000bb] 9787 1726773055.59400: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000000bb 9787 1726773055.59427: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000000bb 9787 1726773055.59430: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } 9733 1726773055.59535: no more pending results, returning what we have 9733 1726773055.59538: results queue empty 9733 1726773055.59539: checking for any_errors_fatal 9733 1726773055.59541: done checking for any_errors_fatal 9733 1726773055.59542: checking for max_fail_percentage 9733 1726773055.59543: done checking for max_fail_percentage 9733 1726773055.59543: checking to see if all hosts have failed and the running result is not ok 9733 1726773055.59544: done checking to see if all hosts have failed 9733 1726773055.59544: getting the remaining hosts for this loop 9733 1726773055.59546: done 
getting the remaining hosts for this loop 9733 1726773055.59549: getting the next task for host managed_node3 9733 1726773055.59556: done getting next task for host managed_node3 9733 1726773055.59560: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 9733 1726773055.59563: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773055.59577: getting variables 9733 1726773055.59578: in VariableManager get_vars() 9733 1726773055.59609: Calling all_inventory to load vars for managed_node3 9733 1726773055.59612: Calling groups_inventory to load vars for managed_node3 9733 1726773055.59613: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773055.59622: Calling all_plugins_play to load vars for managed_node3 9733 1726773055.59624: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773055.59626: Calling groups_plugins_play to load vars for managed_node3 9733 1726773055.59778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773055.59903: done with get_vars() 9733 1726773055.59912: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if system is ostree] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10 Thursday 19 September 2024 15:10:55 -0400 (0:00:00.036) 0:00:01.332 **** 9733 1726773055.59976: entering _queue_task() for managed_node3/stat 9733 1726773055.60144: worker is 1 (out of 1 available) 9733 1726773055.60158: exiting _queue_task() for managed_node3/stat 9733 1726773055.60172: done queuing things up, now waiting for results queue to drain 9733 1726773055.60174: waiting for pending results... 
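
The "Ensure ansible_facts used by role" skip a little further up illustrates the role's lazy fact gathering: setup is re-run only if a required fact is missing from ansible_facts. A sketch reconstructed from the conditional shown in that skip result; the gather_subset argument is an assumption and is not taken from this log:

    - name: Ensure ansible_facts used by role
      ansible.builtin.setup:
        gather_subset: min   # assumption; the role may request a different subset
      when: __kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0
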
9788 1726773055.60270: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 9788 1726773055.60379: in run() - task 0affffe7-6841-7dd6-8fa6-0000000000bd 9788 1726773055.60396: variable 'ansible_search_path' from source: unknown 9788 1726773055.60400: variable 'ansible_search_path' from source: unknown 9788 1726773055.60426: calling self._execute() 9788 1726773055.60480: variable 'ansible_host' from source: host vars for 'managed_node3' 9788 1726773055.60488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9788 1726773055.60494: variable 'omit' from source: magic vars 9788 1726773055.60820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9788 1726773055.60994: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9788 1726773055.61026: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9788 1726773055.61053: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9788 1726773055.61081: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9788 1726773055.61137: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9788 1726773055.61156: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9788 1726773055.61175: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9788 1726773055.61203: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 9788 1726773055.61293: Evaluated conditional (not __kernel_settings_is_ostree is defined): True 9788 1726773055.61301: variable 'omit' from source: magic vars 9788 1726773055.61338: variable 'omit' from source: magic vars 9788 1726773055.61359: variable 'omit' from source: magic vars 9788 1726773055.61380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9788 1726773055.61402: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9788 1726773055.61417: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9788 1726773055.61430: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9788 1726773055.61440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9788 1726773055.61462: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9788 1726773055.61467: variable 'ansible_host' from source: host vars for 'managed_node3' 9788 1726773055.61471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9788 1726773055.61539: Set connection var ansible_timeout to 10 9788 1726773055.61544: Set connection var ansible_shell_type to sh 9788 1726773055.61550: 
Set connection var ansible_module_compression to ZIP_DEFLATED 9788 1726773055.61556: Set connection var ansible_shell_executable to /bin/sh 9788 1726773055.61561: Set connection var ansible_pipelining to False 9788 1726773055.61567: Set connection var ansible_connection to ssh 9788 1726773055.61583: variable 'ansible_shell_executable' from source: unknown 9788 1726773055.61589: variable 'ansible_connection' from source: unknown 9788 1726773055.61591: variable 'ansible_module_compression' from source: unknown 9788 1726773055.61593: variable 'ansible_shell_type' from source: unknown 9788 1726773055.61594: variable 'ansible_shell_executable' from source: unknown 9788 1726773055.61597: variable 'ansible_host' from source: host vars for 'managed_node3' 9788 1726773055.61600: variable 'ansible_pipelining' from source: unknown 9788 1726773055.61602: variable 'ansible_timeout' from source: unknown 9788 1726773055.61604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9788 1726773055.61688: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9788 1726773055.61698: variable 'omit' from source: magic vars 9788 1726773055.61704: starting attempt loop 9788 1726773055.61706: running the handler 9788 1726773055.61715: _low_level_execute_command(): starting 9788 1726773055.61722: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9788 1726773055.64165: stdout chunk (state=2): >>>/root <<< 9788 1726773055.64291: stderr chunk (state=3): >>><<< 9788 1726773055.64299: stdout chunk (state=3): >>><<< 9788 1726773055.64314: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9788 1726773055.64324: _low_level_execute_command(): starting 9788 1726773055.64328: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773055.6431973-9788-96814344355402 `" && echo ansible-tmp-1726773055.6431973-9788-96814344355402="` echo /root/.ansible/tmp/ansible-tmp-1726773055.6431973-9788-96814344355402 `" ) && sleep 0' 9788 1726773055.66813: stdout chunk (state=2): >>>ansible-tmp-1726773055.6431973-9788-96814344355402=/root/.ansible/tmp/ansible-tmp-1726773055.6431973-9788-96814344355402 <<< 9788 1726773055.66942: stderr chunk (state=3): >>><<< 9788 1726773055.66950: stdout chunk (state=3): >>><<< 9788 1726773055.66969: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773055.6431973-9788-96814344355402=/root/.ansible/tmp/ansible-tmp-1726773055.6431973-9788-96814344355402 , stderr= 9788 1726773055.67011: variable 'ansible_module_compression' from source: unknown 9788 1726773055.67056: ANSIBALLZ: Using lock for stat 9788 1726773055.67061: ANSIBALLZ: Acquiring lock 9788 1726773055.67065: ANSIBALLZ: Lock acquired: 139792132688880 9788 1726773055.67071: ANSIBALLZ: Creating module 9788 1726773055.75930: ANSIBALLZ: Writing module into payload 9788 1726773055.76015: ANSIBALLZ: Writing module 9788 1726773055.76034: ANSIBALLZ: Renaming module 9788 1726773055.76041: ANSIBALLZ: Done creating module 9788 1726773055.76057: variable 'ansible_facts' from source: unknown 9788 1726773055.76116: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773055.6431973-9788-96814344355402/AnsiballZ_stat.py 9788 1726773055.76217: Sending initial data 9788 
1726773055.76224: Sent initial data (150 bytes) 9788 1726773055.78938: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpbumtdm3w /root/.ansible/tmp/ansible-tmp-1726773055.6431973-9788-96814344355402/AnsiballZ_stat.py <<< 9788 1726773055.80108: stderr chunk (state=3): >>><<< 9788 1726773055.80116: stdout chunk (state=3): >>><<< 9788 1726773055.80134: done transferring module to remote 9788 1726773055.80145: _low_level_execute_command(): starting 9788 1726773055.80151: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773055.6431973-9788-96814344355402/ /root/.ansible/tmp/ansible-tmp-1726773055.6431973-9788-96814344355402/AnsiballZ_stat.py && sleep 0' 9788 1726773055.82589: stderr chunk (state=2): >>><<< 9788 1726773055.82600: stdout chunk (state=2): >>><<< 9788 1726773055.82615: _low_level_execute_command() done: rc=0, stdout=, stderr= 9788 1726773055.82619: _low_level_execute_command(): starting 9788 1726773055.82624: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773055.6431973-9788-96814344355402/AnsiballZ_stat.py && sleep 0' 9788 1726773055.97837: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 9788 1726773055.98908: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 9788 1726773055.98955: stderr chunk (state=3): >>><<< 9788 1726773055.98962: stdout chunk (state=3): >>><<< 9788 1726773055.98979: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.47.99 closed. 
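Each remote stat in this run costs several SSH round trips (the home-directory probe, tmpdir creation, SFTP transfer of AnsiballZ_stat.py, chmod, module execution, and the cleanup that follows) because the connection vars above show ansible_pipelining set to False. Where the managed nodes permit it, pipelining sends the module over a single connection instead; a minimal, assumed group_vars sketch, not part of this test run:

# group_vars/all.yml (illustrative assumption only)
ansible_pipelining: true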
9788 1726773055.99010: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773055.6431973-9788-96814344355402/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9788 1726773055.99021: _low_level_execute_command(): starting 9788 1726773055.99027: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773055.6431973-9788-96814344355402/ > /dev/null 2>&1 && sleep 0' 9788 1726773056.01562: stderr chunk (state=2): >>><<< 9788 1726773056.01577: stdout chunk (state=2): >>><<< 9788 1726773056.01602: _low_level_execute_command() done: rc=0, stdout=, stderr= 9788 1726773056.01610: handler run complete 9788 1726773056.01624: attempt loop complete, returning result 9788 1726773056.01626: _execute() done 9788 1726773056.01628: dumping result to json 9788 1726773056.01631: done dumping result, returning 9788 1726773056.01636: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree [0affffe7-6841-7dd6-8fa6-0000000000bd] 9788 1726773056.01640: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000000bd 9788 1726773056.01667: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000000bd 9788 1726773056.01671: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 9733 1726773056.01950: no more pending results, returning what we have 9733 1726773056.01953: results queue empty 9733 1726773056.01954: checking for any_errors_fatal 9733 1726773056.01959: done checking for any_errors_fatal 9733 1726773056.01960: checking for max_fail_percentage 9733 1726773056.01961: done checking for max_fail_percentage 9733 1726773056.01962: checking to see if all hosts have failed and the running result is not ok 9733 1726773056.01962: done checking to see if all hosts have failed 9733 1726773056.01963: getting the remaining hosts for this loop 9733 1726773056.01964: done getting the remaining hosts for this loop 9733 1726773056.01967: getting the next task for host managed_node3 9733 1726773056.01975: done getting next task for host managed_node3 9733 1726773056.01978: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 9733 1726773056.01981: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 9733 1726773056.01991: getting variables 9733 1726773056.01992: in VariableManager get_vars() 9733 1726773056.02019: Calling all_inventory to load vars for managed_node3 9733 1726773056.02021: Calling groups_inventory to load vars for managed_node3 9733 1726773056.02023: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773056.02030: Calling all_plugins_play to load vars for managed_node3 9733 1726773056.02031: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773056.02033: Calling groups_plugins_play to load vars for managed_node3 9733 1726773056.02142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773056.02261: done with get_vars() 9733 1726773056.02493: done getting variables 9733 1726773056.02553: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:15 Thursday 19 September 2024 15:10:56 -0400 (0:00:00.426) 0:00:01.758 **** 9733 1726773056.02588: entering _queue_task() for managed_node3/set_fact 9733 1726773056.02839: worker is 1 (out of 1 available) 9733 1726773056.02853: exiting _queue_task() for managed_node3/set_fact 9733 1726773056.02864: done queuing things up, now waiting for results queue to drain 9733 1726773056.02866: waiting for pending results... 
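This set_fact task at set_vars.yml:15 turns the registered stat result into the __kernel_settings_is_ostree flag, which comes out false below, matching stat.exists above. A plausible reconstruction, assuming the register name from the earlier sketch:

# sketch of set_vars.yml:15 (reconstructed, not verbatim)
- name: Set flag to indicate system is ostree
  set_fact:
    __kernel_settings_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __kernel_settings_is_ostree is defined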
9799 1726773056.03079: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 9799 1726773056.03225: in run() - task 0affffe7-6841-7dd6-8fa6-0000000000be 9799 1726773056.03248: variable 'ansible_search_path' from source: unknown 9799 1726773056.03255: variable 'ansible_search_path' from source: unknown 9799 1726773056.03295: calling self._execute() 9799 1726773056.03367: variable 'ansible_host' from source: host vars for 'managed_node3' 9799 1726773056.03377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9799 1726773056.03390: variable 'omit' from source: magic vars 9799 1726773056.03845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9799 1726773056.04062: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9799 1726773056.04105: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9799 1726773056.04131: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9799 1726773056.04157: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9799 1726773056.04220: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9799 1726773056.04240: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9799 1726773056.04258: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9799 1726773056.04280: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 9799 1726773056.04372: Evaluated conditional (not __kernel_settings_is_ostree is defined): True 9799 1726773056.04379: variable 'omit' from source: magic vars 9799 1726773056.04431: variable 'omit' from source: magic vars 9799 1726773056.04517: variable '__ostree_booted_stat' from source: set_fact 9799 1726773056.04560: variable 'omit' from source: magic vars 9799 1726773056.04580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9799 1726773056.04605: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9799 1726773056.04622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9799 1726773056.04636: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9799 1726773056.04646: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9799 1726773056.04671: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9799 1726773056.04676: variable 'ansible_host' from source: host vars for 'managed_node3' 9799 1726773056.04680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9799 1726773056.04753: Set connection var ansible_timeout to 10 
9799 1726773056.04759: Set connection var ansible_shell_type to sh 9799 1726773056.04765: Set connection var ansible_module_compression to ZIP_DEFLATED 9799 1726773056.04772: Set connection var ansible_shell_executable to /bin/sh 9799 1726773056.04777: Set connection var ansible_pipelining to False 9799 1726773056.04784: Set connection var ansible_connection to ssh 9799 1726773056.04801: variable 'ansible_shell_executable' from source: unknown 9799 1726773056.04804: variable 'ansible_connection' from source: unknown 9799 1726773056.04808: variable 'ansible_module_compression' from source: unknown 9799 1726773056.04811: variable 'ansible_shell_type' from source: unknown 9799 1726773056.04814: variable 'ansible_shell_executable' from source: unknown 9799 1726773056.04817: variable 'ansible_host' from source: host vars for 'managed_node3' 9799 1726773056.04821: variable 'ansible_pipelining' from source: unknown 9799 1726773056.04823: variable 'ansible_timeout' from source: unknown 9799 1726773056.04825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9799 1726773056.04888: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9799 1726773056.04897: variable 'omit' from source: magic vars 9799 1726773056.04902: starting attempt loop 9799 1726773056.04904: running the handler 9799 1726773056.04911: handler run complete 9799 1726773056.04915: attempt loop complete, returning result 9799 1726773056.04917: _execute() done 9799 1726773056.04919: dumping result to json 9799 1726773056.04921: done dumping result, returning 9799 1726773056.04925: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree [0affffe7-6841-7dd6-8fa6-0000000000be] 9799 1726773056.04930: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000000be 9799 1726773056.04947: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000000be 9799 1726773056.04949: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_is_ostree": false }, "changed": false } 9733 1726773056.05120: no more pending results, returning what we have 9733 1726773056.05122: results queue empty 9733 1726773056.05123: checking for any_errors_fatal 9733 1726773056.05129: done checking for any_errors_fatal 9733 1726773056.05130: checking for max_fail_percentage 9733 1726773056.05131: done checking for max_fail_percentage 9733 1726773056.05131: checking to see if all hosts have failed and the running result is not ok 9733 1726773056.05132: done checking to see if all hosts have failed 9733 1726773056.05133: getting the remaining hosts for this loop 9733 1726773056.05134: done getting the remaining hosts for this loop 9733 1726773056.05137: getting the next task for host managed_node3 9733 1726773056.05144: done getting next task for host managed_node3 9733 1726773056.05147: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 9733 1726773056.05150: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773056.05159: getting variables 9733 1726773056.05160: in VariableManager get_vars() 9733 1726773056.05194: Calling all_inventory to load vars for managed_node3 9733 1726773056.05196: Calling groups_inventory to load vars for managed_node3 9733 1726773056.05198: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773056.05205: Calling all_plugins_play to load vars for managed_node3 9733 1726773056.05206: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773056.05208: Calling groups_plugins_play to load vars for managed_node3 9733 1726773056.05338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773056.05453: done with get_vars() 9733 1726773056.05460: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:22 Thursday 19 September 2024 15:10:56 -0400 (0:00:00.029) 0:00:01.787 **** 9733 1726773056.05526: entering _queue_task() for managed_node3/stat 9733 1726773056.05705: worker is 1 (out of 1 available) 9733 1726773056.05719: exiting _queue_task() for managed_node3/stat 9733 1726773056.05729: done queuing things up, now waiting for results queue to drain 9733 1726773056.05731: waiting for pending results... 
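The same stat pattern repeats for transactional-update. The module arguments in the result below confirm the path /sbin/transactional-update, and the guard not __kernel_settings_is_transactional is defined is evaluated in the trace; the register name is again inferred from the set_fact task that follows, so this remains a sketch:

# sketch of set_vars.yml:22 (reconstructed, not verbatim)
- name: Check if transactional-update exists in /sbin
  stat:
    path: /sbin/transactional-update
  register: __transactional_update_stat   # name inferred from the later set_fact trace
  when: not __kernel_settings_is_transactional is defined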
9800 1726773056.05849: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 9800 1726773056.05976: in run() - task 0affffe7-6841-7dd6-8fa6-0000000000c0 9800 1726773056.05996: variable 'ansible_search_path' from source: unknown 9800 1726773056.06001: variable 'ansible_search_path' from source: unknown 9800 1726773056.06027: calling self._execute() 9800 1726773056.06079: variable 'ansible_host' from source: host vars for 'managed_node3' 9800 1726773056.06088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9800 1726773056.06094: variable 'omit' from source: magic vars 9800 1726773056.06463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9800 1726773056.06663: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9800 1726773056.06716: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9800 1726773056.06749: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9800 1726773056.06777: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9800 1726773056.06852: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9800 1726773056.06878: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9800 1726773056.06908: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9800 1726773056.06924: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 9800 1726773056.07026: Evaluated conditional (not __kernel_settings_is_transactional is defined): True 9800 1726773056.07035: variable 'omit' from source: magic vars 9800 1726773056.07096: variable 'omit' from source: magic vars 9800 1726773056.07119: variable 'omit' from source: magic vars 9800 1726773056.07141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9800 1726773056.07169: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9800 1726773056.07193: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9800 1726773056.07210: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9800 1726773056.07218: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9800 1726773056.07240: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9800 1726773056.07245: variable 'ansible_host' from source: host vars for 'managed_node3' 9800 1726773056.07247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9800 1726773056.07329: Set connection var ansible_timeout to 10 9800 1726773056.07333: Set connection var ansible_shell_type to 
sh 9800 1726773056.07337: Set connection var ansible_module_compression to ZIP_DEFLATED 9800 1726773056.07340: Set connection var ansible_shell_executable to /bin/sh 9800 1726773056.07346: Set connection var ansible_pipelining to False 9800 1726773056.07350: Set connection var ansible_connection to ssh 9800 1726773056.07363: variable 'ansible_shell_executable' from source: unknown 9800 1726773056.07369: variable 'ansible_connection' from source: unknown 9800 1726773056.07372: variable 'ansible_module_compression' from source: unknown 9800 1726773056.07374: variable 'ansible_shell_type' from source: unknown 9800 1726773056.07375: variable 'ansible_shell_executable' from source: unknown 9800 1726773056.07378: variable 'ansible_host' from source: host vars for 'managed_node3' 9800 1726773056.07380: variable 'ansible_pipelining' from source: unknown 9800 1726773056.07381: variable 'ansible_timeout' from source: unknown 9800 1726773056.07383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9800 1726773056.07534: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9800 1726773056.07548: variable 'omit' from source: magic vars 9800 1726773056.07557: starting attempt loop 9800 1726773056.07560: running the handler 9800 1726773056.07573: _low_level_execute_command(): starting 9800 1726773056.07580: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9800 1726773056.10064: stdout chunk (state=2): >>>/root <<< 9800 1726773056.10188: stderr chunk (state=3): >>><<< 9800 1726773056.10197: stdout chunk (state=3): >>><<< 9800 1726773056.10215: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9800 1726773056.10227: _low_level_execute_command(): starting 9800 1726773056.10233: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773056.1022277-9800-247530632563794 `" && echo ansible-tmp-1726773056.1022277-9800-247530632563794="` echo /root/.ansible/tmp/ansible-tmp-1726773056.1022277-9800-247530632563794 `" ) && sleep 0' 9800 1726773056.12742: stdout chunk (state=2): >>>ansible-tmp-1726773056.1022277-9800-247530632563794=/root/.ansible/tmp/ansible-tmp-1726773056.1022277-9800-247530632563794 <<< 9800 1726773056.12876: stderr chunk (state=3): >>><<< 9800 1726773056.12882: stdout chunk (state=3): >>><<< 9800 1726773056.12898: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773056.1022277-9800-247530632563794=/root/.ansible/tmp/ansible-tmp-1726773056.1022277-9800-247530632563794 , stderr= 9800 1726773056.12933: variable 'ansible_module_compression' from source: unknown 9800 1726773056.12982: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 9800 1726773056.13011: variable 'ansible_facts' from source: unknown 9800 1726773056.13079: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773056.1022277-9800-247530632563794/AnsiballZ_stat.py 9800 1726773056.13179: Sending initial data 9800 1726773056.13188: Sent initial data (151 bytes) 9800 1726773056.15743: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmp_z69h_qd /root/.ansible/tmp/ansible-tmp-1726773056.1022277-9800-247530632563794/AnsiballZ_stat.py 
<<< 9800 1726773056.16906: stderr chunk (state=3): >>><<< 9800 1726773056.16913: stdout chunk (state=3): >>><<< 9800 1726773056.16933: done transferring module to remote 9800 1726773056.16943: _low_level_execute_command(): starting 9800 1726773056.16949: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773056.1022277-9800-247530632563794/ /root/.ansible/tmp/ansible-tmp-1726773056.1022277-9800-247530632563794/AnsiballZ_stat.py && sleep 0' 9800 1726773056.19372: stderr chunk (state=2): >>><<< 9800 1726773056.19380: stdout chunk (state=2): >>><<< 9800 1726773056.19396: _low_level_execute_command() done: rc=0, stdout=, stderr= 9800 1726773056.19401: _low_level_execute_command(): starting 9800 1726773056.19407: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773056.1022277-9800-247530632563794/AnsiballZ_stat.py && sleep 0' 9800 1726773056.34564: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/sbin/transactional-update", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 9800 1726773056.35632: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 9800 1726773056.35683: stderr chunk (state=3): >>><<< 9800 1726773056.35692: stdout chunk (state=3): >>><<< 9800 1726773056.35708: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/sbin/transactional-update", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.47.99 closed. 
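The JSON the module prints on stdout is what ends up in the registered variable on the controller. Assuming the register name from the sketch above, the relevant part of that variable would look like this for the result just returned:

# shape of the registered result (variable name assumed)
__transactional_update_stat:
  changed: false
  stat:
    exists: false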
9800 1726773056.35736: done with _execute_module (stat, {'path': '/sbin/transactional-update', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773056.1022277-9800-247530632563794/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9800 1726773056.35746: _low_level_execute_command(): starting 9800 1726773056.35752: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773056.1022277-9800-247530632563794/ > /dev/null 2>&1 && sleep 0' 9800 1726773056.38257: stderr chunk (state=2): >>><<< 9800 1726773056.38268: stdout chunk (state=2): >>><<< 9800 1726773056.38283: _low_level_execute_command() done: rc=0, stdout=, stderr= 9800 1726773056.38292: handler run complete 9800 1726773056.38307: attempt loop complete, returning result 9800 1726773056.38310: _execute() done 9800 1726773056.38314: dumping result to json 9800 1726773056.38318: done dumping result, returning 9800 1726773056.38327: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin [0affffe7-6841-7dd6-8fa6-0000000000c0] 9800 1726773056.38333: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000000c0 9800 1726773056.38364: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000000c0 9800 1726773056.38367: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 9733 1726773056.38503: no more pending results, returning what we have 9733 1726773056.38506: results queue empty 9733 1726773056.38506: checking for any_errors_fatal 9733 1726773056.38511: done checking for any_errors_fatal 9733 1726773056.38511: checking for max_fail_percentage 9733 1726773056.38513: done checking for max_fail_percentage 9733 1726773056.38513: checking to see if all hosts have failed and the running result is not ok 9733 1726773056.38514: done checking to see if all hosts have failed 9733 1726773056.38514: getting the remaining hosts for this loop 9733 1726773056.38515: done getting the remaining hosts for this loop 9733 1726773056.38519: getting the next task for host managed_node3 9733 1726773056.38523: done getting next task for host managed_node3 9733 1726773056.38527: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 9733 1726773056.38529: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773056.38537: getting variables 9733 1726773056.38538: in VariableManager get_vars() 9733 1726773056.38572: Calling all_inventory to load vars for managed_node3 9733 1726773056.38575: Calling groups_inventory to load vars for managed_node3 9733 1726773056.38577: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773056.38587: Calling all_plugins_play to load vars for managed_node3 9733 1726773056.38590: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773056.38592: Calling groups_plugins_play to load vars for managed_node3 9733 1726773056.38710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773056.38828: done with get_vars() 9733 1726773056.38836: done getting variables 9733 1726773056.38879: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:27 Thursday 19 September 2024 15:10:56 -0400 (0:00:00.333) 0:00:02.121 **** 9733 1726773056.38906: entering _queue_task() for managed_node3/set_fact 9733 1726773056.39081: worker is 1 (out of 1 available) 9733 1726773056.39098: exiting _queue_task() for managed_node3/set_fact 9733 1726773056.39110: done queuing things up, now waiting for results queue to drain 9733 1726773056.39111: waiting for pending results... 
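As with the ostree flag, the set_fact at set_vars.yml:27 collapses the stat result into a boolean; the trace reads __transactional_update_stat and the fact comes out false below. A reconstruction under the same assumptions:

# sketch of set_vars.yml:27 (reconstructed, not verbatim)
- name: Set flag if transactional-update exists
  set_fact:
    __kernel_settings_is_transactional: "{{ __transactional_update_stat.stat.exists }}"
  when: not __kernel_settings_is_transactional is defined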
9808 1726773056.39213: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 9808 1726773056.39327: in run() - task 0affffe7-6841-7dd6-8fa6-0000000000c1 9808 1726773056.39344: variable 'ansible_search_path' from source: unknown 9808 1726773056.39348: variable 'ansible_search_path' from source: unknown 9808 1726773056.39376: calling self._execute() 9808 1726773056.39431: variable 'ansible_host' from source: host vars for 'managed_node3' 9808 1726773056.39439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9808 1726773056.39448: variable 'omit' from source: magic vars 9808 1726773056.39828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9808 1726773056.40001: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9808 1726773056.40034: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9808 1726773056.40059: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9808 1726773056.40087: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9808 1726773056.40147: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9808 1726773056.40168: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9808 1726773056.40190: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9808 1726773056.40209: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 9808 1726773056.40297: Evaluated conditional (not __kernel_settings_is_transactional is defined): True 9808 1726773056.40305: variable 'omit' from source: magic vars 9808 1726773056.40346: variable 'omit' from source: magic vars 9808 1726773056.40426: variable '__transactional_update_stat' from source: set_fact 9808 1726773056.40464: variable 'omit' from source: magic vars 9808 1726773056.40484: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9808 1726773056.40505: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9808 1726773056.40521: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9808 1726773056.40535: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9808 1726773056.40544: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9808 1726773056.40568: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9808 1726773056.40573: variable 'ansible_host' from source: host vars for 'managed_node3' 9808 1726773056.40576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9808 1726773056.40640: Set connection var 
ansible_timeout to 10 9808 1726773056.40644: Set connection var ansible_shell_type to sh 9808 1726773056.40648: Set connection var ansible_module_compression to ZIP_DEFLATED 9808 1726773056.40651: Set connection var ansible_shell_executable to /bin/sh 9808 1726773056.40655: Set connection var ansible_pipelining to False 9808 1726773056.40660: Set connection var ansible_connection to ssh 9808 1726773056.40673: variable 'ansible_shell_executable' from source: unknown 9808 1726773056.40676: variable 'ansible_connection' from source: unknown 9808 1726773056.40678: variable 'ansible_module_compression' from source: unknown 9808 1726773056.40680: variable 'ansible_shell_type' from source: unknown 9808 1726773056.40683: variable 'ansible_shell_executable' from source: unknown 9808 1726773056.40686: variable 'ansible_host' from source: host vars for 'managed_node3' 9808 1726773056.40688: variable 'ansible_pipelining' from source: unknown 9808 1726773056.40690: variable 'ansible_timeout' from source: unknown 9808 1726773056.40693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9808 1726773056.40765: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9808 1726773056.40775: variable 'omit' from source: magic vars 9808 1726773056.40780: starting attempt loop 9808 1726773056.40782: running the handler 9808 1726773056.40790: handler run complete 9808 1726773056.40796: attempt loop complete, returning result 9808 1726773056.40798: _execute() done 9808 1726773056.40800: dumping result to json 9808 1726773056.40802: done dumping result, returning 9808 1726773056.40806: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists [0affffe7-6841-7dd6-8fa6-0000000000c1] 9808 1726773056.40809: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000000c1 9808 1726773056.40825: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000000c1 9808 1726773056.40827: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_is_transactional": false }, "changed": false } 9733 1726773056.41112: no more pending results, returning what we have 9733 1726773056.41114: results queue empty 9733 1726773056.41115: checking for any_errors_fatal 9733 1726773056.41120: done checking for any_errors_fatal 9733 1726773056.41120: checking for max_fail_percentage 9733 1726773056.41121: done checking for max_fail_percentage 9733 1726773056.41121: checking to see if all hosts have failed and the running result is not ok 9733 1726773056.41121: done checking to see if all hosts have failed 9733 1726773056.41122: getting the remaining hosts for this loop 9733 1726773056.41123: done getting the remaining hosts for this loop 9733 1726773056.41125: getting the next task for host managed_node3 9733 1726773056.41131: done getting next task for host managed_node3 9733 1726773056.41133: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 9733 1726773056.41135: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773056.41143: getting variables 9733 1726773056.41148: in VariableManager get_vars() 9733 1726773056.41172: Calling all_inventory to load vars for managed_node3 9733 1726773056.41174: Calling groups_inventory to load vars for managed_node3 9733 1726773056.41175: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773056.41181: Calling all_plugins_play to load vars for managed_node3 9733 1726773056.41182: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773056.41184: Calling groups_plugins_play to load vars for managed_node3 9733 1726773056.41317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773056.41438: done with get_vars() 9733 1726773056.41445: done getting variables 9733 1726773056.41487: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set platform/version specific variables] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:31 Thursday 19 September 2024 15:10:56 -0400 (0:00:00.026) 0:00:02.147 **** 9733 1726773056.41508: entering _queue_task() for managed_node3/include_vars 9733 1726773056.41664: worker is 1 (out of 1 available) 9733 1726773056.41680: exiting _queue_task() for managed_node3/include_vars 9733 1726773056.41692: done queuing things up, now waiting for results queue to drain 9733 1726773056.41694: waiting for pending results... 
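This task runs entirely on the controller: the trace below shows a task-level ffparams variable, candidate names built from ansible_facts plus role_path, and the first_found lookup settling on vars/default.yml, which supplies __kernel_settings_packages and __kernel_settings_services. The exact candidate file names are not visible in the trace, so the list below is an assumption modelled on the common distribution/version fallback pattern; only default.yml is confirmed:

# sketch of set_vars.yml:31 (candidate file names are assumed)
- name: Set platform/version specific variables
  include_vars: "{{ lookup('first_found', ffparams) }}"
  vars:
    ffparams:
      files:
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}.yml"
        - default.yml
      paths:
        - "{{ role_path }}/vars"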
9809 1726773056.41792: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 9809 1726773056.41899: in run() - task 0affffe7-6841-7dd6-8fa6-0000000000c3 9809 1726773056.41915: variable 'ansible_search_path' from source: unknown 9809 1726773056.41920: variable 'ansible_search_path' from source: unknown 9809 1726773056.41948: calling self._execute() 9809 1726773056.42003: variable 'ansible_host' from source: host vars for 'managed_node3' 9809 1726773056.42013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9809 1726773056.42021: variable 'omit' from source: magic vars 9809 1726773056.42091: variable 'omit' from source: magic vars 9809 1726773056.42134: variable 'omit' from source: magic vars 9809 1726773056.42393: variable 'ffparams' from source: task vars 9809 1726773056.42489: variable 'ansible_facts' from source: unknown 9809 1726773056.42617: variable 'ansible_facts' from source: unknown 9809 1726773056.42707: variable 'ansible_facts' from source: unknown 9809 1726773056.42795: variable 'ansible_facts' from source: unknown 9809 1726773056.42870: variable 'role_path' from source: magic vars 9809 1726773056.42993: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 9809 1726773056.43147: Loaded config def from plugin (lookup/first_found) 9809 1726773056.43155: Loading LookupModule 'first_found' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/first_found.py 9809 1726773056.43184: variable 'ansible_search_path' from source: unknown 9809 1726773056.43207: variable 'ansible_search_path' from source: unknown 9809 1726773056.43216: variable 'ansible_search_path' from source: unknown 9809 1726773056.43223: variable 'ansible_search_path' from source: unknown 9809 1726773056.43230: variable 'ansible_search_path' from source: unknown 9809 1726773056.43244: variable 'omit' from source: magic vars 9809 1726773056.43260: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9809 1726773056.43278: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9809 1726773056.43297: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9809 1726773056.43310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9809 1726773056.43319: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9809 1726773056.43338: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9809 1726773056.43341: variable 'ansible_host' from source: host vars for 'managed_node3' 9809 1726773056.43343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9809 1726773056.43410: Set connection var ansible_timeout to 10 9809 1726773056.43415: Set connection var ansible_shell_type to sh 9809 1726773056.43421: Set connection var ansible_module_compression to ZIP_DEFLATED 9809 1726773056.43427: Set connection var ansible_shell_executable to /bin/sh 9809 1726773056.43433: Set connection var ansible_pipelining to False 9809 1726773056.43439: Set connection var ansible_connection to ssh 9809 1726773056.43454: variable 'ansible_shell_executable' from source: unknown 9809 1726773056.43458: variable 'ansible_connection' from source: unknown 9809 1726773056.43461: variable 
'ansible_module_compression' from source: unknown 9809 1726773056.43465: variable 'ansible_shell_type' from source: unknown 9809 1726773056.43468: variable 'ansible_shell_executable' from source: unknown 9809 1726773056.43471: variable 'ansible_host' from source: host vars for 'managed_node3' 9809 1726773056.43475: variable 'ansible_pipelining' from source: unknown 9809 1726773056.43478: variable 'ansible_timeout' from source: unknown 9809 1726773056.43483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9809 1726773056.43552: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9809 1726773056.43563: variable 'omit' from source: magic vars 9809 1726773056.43569: starting attempt loop 9809 1726773056.43572: running the handler 9809 1726773056.43615: handler run complete 9809 1726773056.43625: attempt loop complete, returning result 9809 1726773056.43629: _execute() done 9809 1726773056.43633: dumping result to json 9809 1726773056.43637: done dumping result, returning 9809 1726773056.43643: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables [0affffe7-6841-7dd6-8fa6-0000000000c3] 9809 1726773056.43648: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000000c3 9809 1726773056.43669: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000000c3 9809 1726773056.43671: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_packages": [ "tuned", "python3-configobj" ], "__kernel_settings_services": [ "tuned" ] }, "ansible_included_var_files": [ "/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/default.yml" ], "changed": false } 9733 1726773056.43810: no more pending results, returning what we have 9733 1726773056.43812: results queue empty 9733 1726773056.43813: checking for any_errors_fatal 9733 1726773056.43817: done checking for any_errors_fatal 9733 1726773056.43817: checking for max_fail_percentage 9733 1726773056.43818: done checking for max_fail_percentage 9733 1726773056.43819: checking to see if all hosts have failed and the running result is not ok 9733 1726773056.43819: done checking to see if all hosts have failed 9733 1726773056.43820: getting the remaining hosts for this loop 9733 1726773056.43821: done getting the remaining hosts for this loop 9733 1726773056.43824: getting the next task for host managed_node3 9733 1726773056.43829: done getting next task for host managed_node3 9733 1726773056.43832: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 9733 1726773056.43834: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773056.43845: getting variables 9733 1726773056.43846: in VariableManager get_vars() 9733 1726773056.43876: Calling all_inventory to load vars for managed_node3 9733 1726773056.43879: Calling groups_inventory to load vars for managed_node3 9733 1726773056.43880: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773056.43890: Calling all_plugins_play to load vars for managed_node3 9733 1726773056.43892: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773056.43894: Calling groups_plugins_play to load vars for managed_node3 9733 1726773056.44013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773056.44131: done with get_vars() 9733 1726773056.44138: done getting variables 9733 1726773056.44209: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required packages are installed] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 Thursday 19 September 2024 15:10:56 -0400 (0:00:00.027) 0:00:02.174 **** 9733 1726773056.44229: entering _queue_task() for managed_node3/package 9733 1726773056.44230: Creating lock for package 9733 1726773056.44399: worker is 1 (out of 1 available) 9733 1726773056.44413: exiting _queue_task() for managed_node3/package 9733 1726773056.44425: done queuing things up, now waiting for results queue to drain 9733 1726773056.44427: waiting for pending results... 
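The package task at main.yml:12 installs the list loaded by include_vars; the dnf invocation further down confirms name ['tuned', 'python3-configobj'] and state present, and the generic package action resolved to ansible.legacy.dnf on this host. The trace also reads __kernel_settings_is_ostree at this point, presumably to choose a different backend on ostree systems; that branch is left out of this minimal sketch:

# sketch of main.yml:12 (minimal; any ostree-specific handling omitted)
- name: Ensure required packages are installed
  package:
    name: "{{ __kernel_settings_packages }}"
    state: present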
9810 1726773056.44529: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 9810 1726773056.44630: in run() - task 0affffe7-6841-7dd6-8fa6-000000000049 9810 1726773056.44646: variable 'ansible_search_path' from source: unknown 9810 1726773056.44650: variable 'ansible_search_path' from source: unknown 9810 1726773056.44679: calling self._execute() 9810 1726773056.44734: variable 'ansible_host' from source: host vars for 'managed_node3' 9810 1726773056.44743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9810 1726773056.44751: variable 'omit' from source: magic vars 9810 1726773056.44826: variable 'omit' from source: magic vars 9810 1726773056.44922: variable 'omit' from source: magic vars 9810 1726773056.44945: variable '__kernel_settings_packages' from source: include_vars 9810 1726773056.45151: variable '__kernel_settings_packages' from source: include_vars 9810 1726773056.45306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9810 1726773056.46780: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9810 1726773056.46837: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9810 1726773056.46865: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9810 1726773056.46898: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9810 1726773056.46920: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9810 1726773056.46992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9810 1726773056.47013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9810 1726773056.47032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9810 1726773056.47062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9810 1726773056.47075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9810 1726773056.47151: variable '__kernel_settings_is_ostree' from source: set_fact 9810 1726773056.47158: variable 'omit' from source: magic vars 9810 1726773056.47183: variable 'omit' from source: magic vars 9810 1726773056.47207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9810 1726773056.47226: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9810 1726773056.47242: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9810 1726773056.47256: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9810 1726773056.47265: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9810 1726773056.47292: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9810 1726773056.47297: variable 'ansible_host' from source: host vars for 'managed_node3' 9810 1726773056.47301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9810 1726773056.47367: Set connection var ansible_timeout to 10 9810 1726773056.47372: Set connection var ansible_shell_type to sh 9810 1726773056.47379: Set connection var ansible_module_compression to ZIP_DEFLATED 9810 1726773056.47386: Set connection var ansible_shell_executable to /bin/sh 9810 1726773056.47393: Set connection var ansible_pipelining to False 9810 1726773056.47400: Set connection var ansible_connection to ssh 9810 1726773056.47417: variable 'ansible_shell_executable' from source: unknown 9810 1726773056.47421: variable 'ansible_connection' from source: unknown 9810 1726773056.47424: variable 'ansible_module_compression' from source: unknown 9810 1726773056.47427: variable 'ansible_shell_type' from source: unknown 9810 1726773056.47430: variable 'ansible_shell_executable' from source: unknown 9810 1726773056.47434: variable 'ansible_host' from source: host vars for 'managed_node3' 9810 1726773056.47438: variable 'ansible_pipelining' from source: unknown 9810 1726773056.47441: variable 'ansible_timeout' from source: unknown 9810 1726773056.47446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9810 1726773056.47511: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9810 1726773056.47523: variable 'omit' from source: magic vars 9810 1726773056.47529: starting attempt loop 9810 1726773056.47533: running the handler 9810 1726773056.47594: variable 'ansible_facts' from source: unknown 9810 1726773056.47674: _low_level_execute_command(): starting 9810 1726773056.47683: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9810 1726773056.50069: stdout chunk (state=2): >>>/root <<< 9810 1726773056.50198: stderr chunk (state=3): >>><<< 9810 1726773056.50204: stdout chunk (state=3): >>><<< 9810 1726773056.50222: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9810 1726773056.50232: _low_level_execute_command(): starting 9810 1726773056.50238: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773056.50229-9810-104198073826655 `" && echo ansible-tmp-1726773056.50229-9810-104198073826655="` echo /root/.ansible/tmp/ansible-tmp-1726773056.50229-9810-104198073826655 `" ) && sleep 0' 9810 1726773056.52777: stdout chunk (state=2): >>>ansible-tmp-1726773056.50229-9810-104198073826655=/root/.ansible/tmp/ansible-tmp-1726773056.50229-9810-104198073826655 <<< 9810 1726773056.52905: stderr chunk (state=3): >>><<< 9810 1726773056.52911: stdout chunk (state=3): >>><<< 9810 1726773056.52925: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726773056.50229-9810-104198073826655=/root/.ansible/tmp/ansible-tmp-1726773056.50229-9810-104198073826655 , stderr= 9810 1726773056.52951: variable 'ansible_module_compression' from source: unknown 9810 1726773056.52997: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 9810 1726773056.53002: ANSIBALLZ: Acquiring lock 9810 1726773056.53006: ANSIBALLZ: Lock acquired: 139792132305312 9810 1726773056.53009: ANSIBALLZ: Creating module 9810 1726773056.65238: ANSIBALLZ: Writing module into payload 9810 1726773056.65437: ANSIBALLZ: Writing module 9810 1726773056.65460: ANSIBALLZ: Renaming module 9810 1726773056.65467: ANSIBALLZ: Done creating module 9810 1726773056.65483: variable 'ansible_facts' from source: unknown 9810 1726773056.65560: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773056.50229-9810-104198073826655/AnsiballZ_dnf.py 9810 1726773056.65673: Sending initial data 9810 1726773056.65680: Sent initial data (148 bytes) 9810 1726773056.68350: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmp2d_lsxzz /root/.ansible/tmp/ansible-tmp-1726773056.50229-9810-104198073826655/AnsiballZ_dnf.py <<< 9810 1726773056.69839: stderr chunk (state=3): >>><<< 9810 1726773056.69848: stdout chunk (state=3): >>><<< 9810 1726773056.69868: done transferring module to remote 9810 1726773056.69879: _low_level_execute_command(): starting 9810 1726773056.69886: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773056.50229-9810-104198073826655/ /root/.ansible/tmp/ansible-tmp-1726773056.50229-9810-104198073826655/AnsiballZ_dnf.py && sleep 0' 9810 1726773056.72337: stderr chunk (state=2): >>><<< 9810 1726773056.72347: stdout chunk (state=2): >>><<< 9810 1726773056.72361: _low_level_execute_command() done: rc=0, stdout=, stderr= 9810 1726773056.72365: _low_level_execute_command(): starting 9810 1726773056.72372: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773056.50229-9810-104198073826655/AnsiballZ_dnf.py && sleep 0' 9810 1726773059.23162: stdout chunk (state=2): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 9810 1726773059.30773: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. 
<<< 9810 1726773059.30820: stderr chunk (state=3): >>><<< 9810 1726773059.30827: stdout chunk (state=3): >>><<< 9810 1726773059.30846: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.47.99 closed. 9810 1726773059.30882: done with _execute_module (ansible.legacy.dnf, {'name': ['tuned', 'python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773056.50229-9810-104198073826655/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9810 1726773059.30893: _low_level_execute_command(): starting 9810 1726773059.30899: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773056.50229-9810-104198073826655/ > /dev/null 2>&1 && sleep 0' 9810 1726773059.33374: stderr chunk (state=2): >>><<< 9810 1726773059.33386: stdout chunk (state=2): >>><<< 9810 1726773059.33402: _low_level_execute_command() done: rc=0, stdout=, stderr= 9810 1726773059.33410: handler run complete 9810 1726773059.33438: attempt loop complete, returning result 9810 1726773059.33442: _execute() done 9810 1726773059.33446: dumping result to json 9810 1726773059.33452: done dumping result, returning 9810 1726773059.33459: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed [0affffe7-6841-7dd6-8fa6-000000000049] 9810 1726773059.33464: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000049 9810 1726773059.33497: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000049 9810 1726773059.33501: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 9733 1726773059.33655: no more pending results, returning what we have 9733 1726773059.33658: results queue empty 9733 1726773059.33659: checking for any_errors_fatal 9733 1726773059.33665: done checking for any_errors_fatal 9733 1726773059.33665: checking for max_fail_percentage 9733 1726773059.33666: done checking for max_fail_percentage 9733 1726773059.33667: checking to see if all hosts have failed and the running result is not ok 9733 1726773059.33668: done checking to see if all hosts have failed 9733 1726773059.33671: getting the remaining hosts for this loop 9733 1726773059.33672: done getting 
the remaining hosts for this loop 9733 1726773059.33675: getting the next task for host managed_node3 9733 1726773059.33681: done getting next task for host managed_node3 9733 1726773059.33684: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 9733 1726773059.33687: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773059.33696: getting variables 9733 1726773059.33697: in VariableManager get_vars() 9733 1726773059.33726: Calling all_inventory to load vars for managed_node3 9733 1726773059.33729: Calling groups_inventory to load vars for managed_node3 9733 1726773059.33731: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773059.33740: Calling all_plugins_play to load vars for managed_node3 9733 1726773059.33743: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773059.33745: Calling groups_plugins_play to load vars for managed_node3 9733 1726773059.33919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773059.34034: done with get_vars() 9733 1726773059.34042: done getting variables 9733 1726773059.34115: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:24 Thursday 19 September 2024 15:10:59 -0400 (0:00:02.899) 0:00:05.073 **** 9733 1726773059.34139: entering _queue_task() for managed_node3/debug 9733 1726773059.34140: Creating lock for debug 9733 1726773059.34317: worker is 1 (out of 1 available) 9733 1726773059.34331: exiting _queue_task() for managed_node3/debug 9733 1726773059.34343: done queuing things up, now waiting for results queue to drain 9733 1726773059.34345: waiting for pending results... 
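The package task above resolved the role variable __kernel_settings_packages to tuned and python3-configobj and dispatched ansible.legacy.dnf, which reported "Nothing to do" because both packages were already installed. A minimal sketch of an equivalent task, reconstructed from the logged module_args (the exact wording in the role's own tasks/main.yml may differ):

- name: Ensure required packages are installed
  ansible.builtin.package:
    # name resolves to [tuned, python3-configobj] on this host, per the logged invocation
    name: "{{ __kernel_settings_packages }}"
    state: present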
9863 1726773059.34453: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 9863 1726773059.34557: in run() - task 0affffe7-6841-7dd6-8fa6-00000000004b 9863 1726773059.34573: variable 'ansible_search_path' from source: unknown 9863 1726773059.34578: variable 'ansible_search_path' from source: unknown 9863 1726773059.34608: calling self._execute() 9863 1726773059.34666: variable 'ansible_host' from source: host vars for 'managed_node3' 9863 1726773059.34675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9863 1726773059.34686: variable 'omit' from source: magic vars 9863 1726773059.35020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9863 1726773059.36552: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9863 1726773059.36605: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9863 1726773059.36634: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9863 1726773059.36667: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9863 1726773059.36696: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9863 1726773059.36755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9863 1726773059.36778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9863 1726773059.36800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9863 1726773059.36827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9863 1726773059.36839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9863 1726773059.36921: variable '__kernel_settings_is_transactional' from source: set_fact 9863 1726773059.36937: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 9863 1726773059.36942: when evaluation is False, skipping this task 9863 1726773059.36945: _execute() done 9863 1726773059.36949: dumping result to json 9863 1726773059.36953: done dumping result, returning 9863 1726773059.36960: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes [0affffe7-6841-7dd6-8fa6-00000000004b] 9863 1726773059.36965: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000004b 9863 1726773059.36990: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000004b 9863 1726773059.36992: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "__kernel_settings_is_transactional | d(false)" } 9733 
1726773059.37250: no more pending results, returning what we have 9733 1726773059.37252: results queue empty 9733 1726773059.37253: checking for any_errors_fatal 9733 1726773059.37258: done checking for any_errors_fatal 9733 1726773059.37258: checking for max_fail_percentage 9733 1726773059.37259: done checking for max_fail_percentage 9733 1726773059.37259: checking to see if all hosts have failed and the running result is not ok 9733 1726773059.37260: done checking to see if all hosts have failed 9733 1726773059.37260: getting the remaining hosts for this loop 9733 1726773059.37261: done getting the remaining hosts for this loop 9733 1726773059.37263: getting the next task for host managed_node3 9733 1726773059.37267: done getting next task for host managed_node3 9733 1726773059.37270: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 9733 1726773059.37272: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773059.37281: getting variables 9733 1726773059.37282: in VariableManager get_vars() 9733 1726773059.37308: Calling all_inventory to load vars for managed_node3 9733 1726773059.37310: Calling groups_inventory to load vars for managed_node3 9733 1726773059.37311: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773059.37318: Calling all_plugins_play to load vars for managed_node3 9733 1726773059.37319: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773059.37321: Calling groups_plugins_play to load vars for managed_node3 9733 1726773059.37424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773059.37549: done with get_vars() 9733 1726773059.37559: done getting variables 9733 1726773059.37656: Loading ActionModule 'reboot' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Reboot transactional update systems] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:29 Thursday 19 September 2024 15:10:59 -0400 (0:00:00.035) 0:00:05.109 **** 9733 1726773059.37678: entering _queue_task() for managed_node3/reboot 9733 1726773059.37680: Creating lock for reboot 9733 1726773059.37902: worker is 1 (out of 1 available) 9733 1726773059.37917: exiting _queue_task() for managed_node3/reboot 9733 1726773059.37929: done queuing things up, now waiting for results queue to drain 9733 1726773059.37931: waiting for pending results... 
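The "Notify user that reboot is needed to apply changes" debug task above was skipped because __kernel_settings_is_transactional evaluated to false on managed_node3. A hedged reconstruction of that guard, inferred from the logged false_condition (the message text is illustrative and not taken from the role):

- name: Notify user that reboot is needed to apply changes
  ansible.builtin.debug:
    msg: A reboot is required to apply the requested changes  # placeholder wording
  when: __kernel_settings_is_transactional | d(false)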
9864 1726773059.38051: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 9864 1726773059.38164: in run() - task 0affffe7-6841-7dd6-8fa6-00000000004c 9864 1726773059.38183: variable 'ansible_search_path' from source: unknown 9864 1726773059.38189: variable 'ansible_search_path' from source: unknown 9864 1726773059.38225: calling self._execute() 9864 1726773059.38308: variable 'ansible_host' from source: host vars for 'managed_node3' 9864 1726773059.38317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9864 1726773059.38325: variable 'omit' from source: magic vars 9864 1726773059.38666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9864 1726773059.40351: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9864 1726773059.40422: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9864 1726773059.40459: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9864 1726773059.40497: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9864 1726773059.40522: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9864 1726773059.40597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9864 1726773059.40624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9864 1726773059.40647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9864 1726773059.40689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9864 1726773059.40703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9864 1726773059.40812: variable '__kernel_settings_is_transactional' from source: set_fact 9864 1726773059.40831: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 9864 1726773059.40840: when evaluation is False, skipping this task 9864 1726773059.40842: _execute() done 9864 1726773059.40844: dumping result to json 9864 1726773059.40847: done dumping result, returning 9864 1726773059.40851: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems [0affffe7-6841-7dd6-8fa6-00000000004c] 9864 1726773059.40855: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000004c 9864 1726773059.40877: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000004c 9864 1726773059.40879: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", "skip_reason": "Conditional 
result was False" } 9733 1726773059.41036: no more pending results, returning what we have 9733 1726773059.41039: results queue empty 9733 1726773059.41039: checking for any_errors_fatal 9733 1726773059.41044: done checking for any_errors_fatal 9733 1726773059.41044: checking for max_fail_percentage 9733 1726773059.41045: done checking for max_fail_percentage 9733 1726773059.41046: checking to see if all hosts have failed and the running result is not ok 9733 1726773059.41047: done checking to see if all hosts have failed 9733 1726773059.41047: getting the remaining hosts for this loop 9733 1726773059.41048: done getting the remaining hosts for this loop 9733 1726773059.41051: getting the next task for host managed_node3 9733 1726773059.41058: done getting next task for host managed_node3 9733 1726773059.41061: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 9733 1726773059.41063: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773059.41075: getting variables 9733 1726773059.41076: in VariableManager get_vars() 9733 1726773059.41109: Calling all_inventory to load vars for managed_node3 9733 1726773059.41112: Calling groups_inventory to load vars for managed_node3 9733 1726773059.41114: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773059.41122: Calling all_plugins_play to load vars for managed_node3 9733 1726773059.41124: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773059.41126: Calling groups_plugins_play to load vars for managed_node3 9733 1726773059.41246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773059.41360: done with get_vars() 9733 1726773059.41371: done getting variables 9733 1726773059.41419: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:34 Thursday 19 September 2024 15:10:59 -0400 (0:00:00.037) 0:00:05.146 **** 9733 1726773059.41440: entering _queue_task() for managed_node3/fail 9733 1726773059.41614: worker is 1 (out of 1 available) 9733 1726773059.41628: exiting _queue_task() for managed_node3/fail 9733 1726773059.41639: done queuing things up, now waiting for results queue to drain 9733 1726773059.41642: waiting for pending results... 
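The reboot task is gated on the same set_fact variable and is skipped for the same reason. A minimal sketch, assuming the condition shown in the log is the task's only when clause:

- name: Reboot transactional update systems
  ansible.builtin.reboot:
  when: __kernel_settings_is_transactional | d(false)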
9866 1726773059.41752: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 9866 1726773059.41856: in run() - task 0affffe7-6841-7dd6-8fa6-00000000004d 9866 1726773059.41874: variable 'ansible_search_path' from source: unknown 9866 1726773059.41879: variable 'ansible_search_path' from source: unknown 9866 1726773059.41907: calling self._execute() 9866 1726773059.41965: variable 'ansible_host' from source: host vars for 'managed_node3' 9866 1726773059.41976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9866 1726773059.41987: variable 'omit' from source: magic vars 9866 1726773059.42316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9866 1726773059.44423: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9866 1726773059.44488: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9866 1726773059.44535: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9866 1726773059.44567: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9866 1726773059.44594: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9866 1726773059.44663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9866 1726773059.44693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9866 1726773059.44717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9866 1726773059.44757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9866 1726773059.44771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9866 1726773059.44873: variable '__kernel_settings_is_transactional' from source: set_fact 9866 1726773059.44894: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 9866 1726773059.44899: when evaluation is False, skipping this task 9866 1726773059.44903: _execute() done 9866 1726773059.44906: dumping result to json 9866 1726773059.44910: done dumping result, returning 9866 1726773059.44916: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set [0affffe7-6841-7dd6-8fa6-00000000004d] 9866 1726773059.44922: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000004d 9866 1726773059.44947: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000004d 9866 1726773059.44949: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", "skip_reason": "Conditional 
result was False" } 9733 1726773059.45142: no more pending results, returning what we have 9733 1726773059.45144: results queue empty 9733 1726773059.45145: checking for any_errors_fatal 9733 1726773059.45150: done checking for any_errors_fatal 9733 1726773059.45151: checking for max_fail_percentage 9733 1726773059.45152: done checking for max_fail_percentage 9733 1726773059.45153: checking to see if all hosts have failed and the running result is not ok 9733 1726773059.45153: done checking to see if all hosts have failed 9733 1726773059.45154: getting the remaining hosts for this loop 9733 1726773059.45155: done getting the remaining hosts for this loop 9733 1726773059.45158: getting the next task for host managed_node3 9733 1726773059.45165: done getting next task for host managed_node3 9733 1726773059.45168: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 9733 1726773059.45172: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773059.45183: getting variables 9733 1726773059.45187: in VariableManager get_vars() 9733 1726773059.45218: Calling all_inventory to load vars for managed_node3 9733 1726773059.45221: Calling groups_inventory to load vars for managed_node3 9733 1726773059.45223: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773059.45231: Calling all_plugins_play to load vars for managed_node3 9733 1726773059.45233: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773059.45236: Calling groups_plugins_play to load vars for managed_node3 9733 1726773059.45400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773059.45642: done with get_vars() 9733 1726773059.45654: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Read tuned main config] ****** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42 Thursday 19 September 2024 15:10:59 -0400 (0:00:00.042) 0:00:05.189 **** 9733 1726773059.45743: entering _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config 9733 1726773059.45744: Creating lock for fedora.linux_system_roles.kernel_settings_get_config 9733 1726773059.45981: worker is 1 (out of 1 available) 9733 1726773059.45998: exiting _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config 9733 1726773059.46011: done queuing things up, now waiting for results queue to drain 9733 1726773059.46013: waiting for pending results... 
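The fail-guard task is likewise skipped on this host. A sketch of the pattern; only the transactional condition is visible in the log, so the real task likely combines it with additional checks, and the message below is a placeholder:

- name: Fail if reboot is needed and not set
  ansible.builtin.fail:
    msg: Reboot is required to apply changes but was not permitted  # placeholder message
  when: __kernel_settings_is_transactional | d(false)  # the role may add further conditions not shown in this log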
9868 1726773059.46276: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 9868 1726773059.46411: in run() - task 0affffe7-6841-7dd6-8fa6-00000000004f 9868 1726773059.46430: variable 'ansible_search_path' from source: unknown 9868 1726773059.46435: variable 'ansible_search_path' from source: unknown 9868 1726773059.46472: calling self._execute() 9868 1726773059.46549: variable 'ansible_host' from source: host vars for 'managed_node3' 9868 1726773059.46558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9868 1726773059.46566: variable 'omit' from source: magic vars 9868 1726773059.46671: variable 'omit' from source: magic vars 9868 1726773059.46717: variable 'omit' from source: magic vars 9868 1726773059.46748: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 9868 1726773059.47040: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 9868 1726773059.47198: variable '__kernel_settings_tuned_dir' from source: role '' all vars 9868 1726773059.47236: variable 'omit' from source: magic vars 9868 1726773059.47279: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9868 1726773059.47317: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9868 1726773059.47339: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9868 1726773059.47355: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9868 1726773059.47367: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9868 1726773059.47400: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9868 1726773059.47405: variable 'ansible_host' from source: host vars for 'managed_node3' 9868 1726773059.47409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9868 1726773059.47506: Set connection var ansible_timeout to 10 9868 1726773059.47512: Set connection var ansible_shell_type to sh 9868 1726773059.47517: Set connection var ansible_module_compression to ZIP_DEFLATED 9868 1726773059.47523: Set connection var ansible_shell_executable to /bin/sh 9868 1726773059.47528: Set connection var ansible_pipelining to False 9868 1726773059.47535: Set connection var ansible_connection to ssh 9868 1726773059.47554: variable 'ansible_shell_executable' from source: unknown 9868 1726773059.47557: variable 'ansible_connection' from source: unknown 9868 1726773059.47560: variable 'ansible_module_compression' from source: unknown 9868 1726773059.47563: variable 'ansible_shell_type' from source: unknown 9868 1726773059.47566: variable 'ansible_shell_executable' from source: unknown 9868 1726773059.47571: variable 'ansible_host' from source: host vars for 'managed_node3' 9868 1726773059.47575: variable 'ansible_pipelining' from source: unknown 9868 1726773059.47578: variable 'ansible_timeout' from source: unknown 9868 1726773059.47581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9868 1726773059.47749: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9868 1726773059.47757: variable 'omit' from source: magic vars 9868 1726773059.47762: starting attempt loop 9868 1726773059.47764: running the handler 9868 1726773059.47776: _low_level_execute_command(): starting 9868 1726773059.47782: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9868 1726773059.50170: stdout chunk (state=2): >>>/root <<< 9868 1726773059.50311: stderr chunk (state=3): >>><<< 9868 1726773059.50319: stdout chunk (state=3): >>><<< 9868 1726773059.50339: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9868 1726773059.50357: _low_level_execute_command(): starting 9868 1726773059.50364: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773059.503498-9868-214973745811674 `" && echo ansible-tmp-1726773059.503498-9868-214973745811674="` echo /root/.ansible/tmp/ansible-tmp-1726773059.503498-9868-214973745811674 `" ) && sleep 0' 9868 1726773059.53070: stdout chunk (state=2): >>>ansible-tmp-1726773059.503498-9868-214973745811674=/root/.ansible/tmp/ansible-tmp-1726773059.503498-9868-214973745811674 <<< 9868 1726773059.53232: stderr chunk (state=3): >>><<< 9868 1726773059.53239: stdout chunk (state=3): >>><<< 9868 1726773059.53256: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773059.503498-9868-214973745811674=/root/.ansible/tmp/ansible-tmp-1726773059.503498-9868-214973745811674 , stderr= 9868 1726773059.53304: variable 'ansible_module_compression' from source: unknown 9868 1726773059.53344: ANSIBALLZ: Using lock for fedora.linux_system_roles.kernel_settings_get_config 9868 1726773059.53349: ANSIBALLZ: Acquiring lock 9868 1726773059.53353: ANSIBALLZ: Lock acquired: 139792132803904 9868 1726773059.53356: ANSIBALLZ: Creating module 9868 1726773059.63463: ANSIBALLZ: Writing module into payload 9868 1726773059.63528: ANSIBALLZ: Writing module 9868 1726773059.63551: ANSIBALLZ: Renaming module 9868 1726773059.63558: ANSIBALLZ: Done creating module 9868 1726773059.63578: variable 'ansible_facts' from source: unknown 9868 1726773059.63636: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773059.503498-9868-214973745811674/AnsiballZ_kernel_settings_get_config.py 9868 1726773059.63738: Sending initial data 9868 1726773059.63746: Sent initial data (172 bytes) 9868 1726773059.66389: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpqnkfnyld /root/.ansible/tmp/ansible-tmp-1726773059.503498-9868-214973745811674/AnsiballZ_kernel_settings_get_config.py <<< 9868 1726773059.67549: stderr chunk (state=3): >>><<< 9868 1726773059.67557: stdout chunk (state=3): >>><<< 9868 1726773059.67576: done transferring module to remote 9868 1726773059.67589: _low_level_execute_command(): starting 9868 1726773059.67595: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773059.503498-9868-214973745811674/ /root/.ansible/tmp/ansible-tmp-1726773059.503498-9868-214973745811674/AnsiballZ_kernel_settings_get_config.py && sleep 0' 9868 1726773059.70033: stderr chunk (state=2): >>><<< 9868 1726773059.70041: stdout chunk (state=2): >>><<< 9868 1726773059.70057: _low_level_execute_command() done: rc=0, stdout=, stderr= 9868 1726773059.70062: _low_level_execute_command(): starting 9868 1726773059.70067: 
_low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773059.503498-9868-214973745811674/AnsiballZ_kernel_settings_get_config.py && sleep 0' 9868 1726773059.85689: stdout chunk (state=2): >>> {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} <<< 9868 1726773059.86835: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 9868 1726773059.86846: stdout chunk (state=3): >>><<< 9868 1726773059.86858: stderr chunk (state=3): >>><<< 9868 1726773059.86875: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} , stderr=Shared connection to 10.31.47.99 closed. 9868 1726773059.86909: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/tuned-main.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773059.503498-9868-214973745811674/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9868 1726773059.86922: _low_level_execute_command(): starting 9868 1726773059.86928: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773059.503498-9868-214973745811674/ > /dev/null 2>&1 && sleep 0' 9868 1726773059.89845: stderr chunk (state=2): >>><<< 9868 1726773059.89856: stdout chunk (state=2): >>><<< 9868 1726773059.89873: _low_level_execute_command() done: rc=0, stdout=, stderr= 9868 1726773059.89881: handler run complete 9868 1726773059.89899: attempt loop complete, returning result 9868 1726773059.89903: _execute() done 9868 1726773059.89907: dumping result to json 9868 1726773059.89911: done dumping result, returning 9868 1726773059.89920: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config [0affffe7-6841-7dd6-8fa6-00000000004f] 9868 1726773059.89925: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000004f 9868 1726773059.89962: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000004f 9868 1726773059.89966: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "data": { "daemon": "1", "default_instance_priority": "0", "dynamic_tuning": "0", "log_file_count": "2", "log_file_max_size": "1MB", "reapply_sysctl": "1", "recommend_command": "1", "sleep_interval": "1", "udev_buffer_size": "1MB", "update_interval": "10" } } 9733 1726773059.90358: no more pending results, returning what we have 9733 1726773059.90362: 
results queue empty 9733 1726773059.90362: checking for any_errors_fatal 9733 1726773059.90371: done checking for any_errors_fatal 9733 1726773059.90371: checking for max_fail_percentage 9733 1726773059.90373: done checking for max_fail_percentage 9733 1726773059.90373: checking to see if all hosts have failed and the running result is not ok 9733 1726773059.90374: done checking to see if all hosts have failed 9733 1726773059.90374: getting the remaining hosts for this loop 9733 1726773059.90375: done getting the remaining hosts for this loop 9733 1726773059.90379: getting the next task for host managed_node3 9733 1726773059.90384: done getting next task for host managed_node3 9733 1726773059.90389: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 9733 1726773059.90391: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773059.90400: getting variables 9733 1726773059.90402: in VariableManager get_vars() 9733 1726773059.90433: Calling all_inventory to load vars for managed_node3 9733 1726773059.90436: Calling groups_inventory to load vars for managed_node3 9733 1726773059.90438: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773059.90446: Calling all_plugins_play to load vars for managed_node3 9733 1726773059.90449: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773059.90451: Calling groups_plugins_play to load vars for managed_node3 9733 1726773059.90615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773059.90775: done with get_vars() 9733 1726773059.90784: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 Thursday 19 September 2024 15:10:59 -0400 (0:00:00.451) 0:00:05.641 **** 9733 1726773059.90861: entering _queue_task() for managed_node3/stat 9733 1726773059.91047: worker is 1 (out of 1 available) 9733 1726773059.91060: exiting _queue_task() for managed_node3/stat 9733 1726773059.91074: done queuing things up, now waiting for results queue to drain 9733 1726773059.91076: waiting for pending results... 
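The "Read tuned main config" task called the collection's kernel_settings_get_config module against /etc/tuned/tuned-main.conf and got back the parsed settings shown above (daemon=1, dynamic_tuning=0, and so on). A sketch of the invocation as reconstructed from the logged module_args; the register name is inferred from the __kernel_settings_register_tuned_main variable referenced shortly afterwards, not shown verbatim in the log:

- name: Read tuned main config
  fedora.linux_system_roles.kernel_settings_get_config:
    path: /etc/tuned/tuned-main.conf
  register: __kernel_settings_register_tuned_main  # inferred from later variable lookups

The returned data feeds the next task, which stats candidate tuned profile locations (/etc/tuned/profiles, then /etc/tuned) to find the profile parent directory.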
9892 1726773059.91198: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 9892 1726773059.91313: in run() - task 0affffe7-6841-7dd6-8fa6-000000000050 9892 1726773059.91330: variable 'ansible_search_path' from source: unknown 9892 1726773059.91334: variable 'ansible_search_path' from source: unknown 9892 1726773059.91374: variable '__prof_from_conf' from source: task vars 9892 1726773059.91607: variable '__prof_from_conf' from source: task vars 9892 1726773059.91740: variable '__data' from source: task vars 9892 1726773059.91796: variable '__kernel_settings_register_tuned_main' from source: set_fact 9892 1726773059.91934: variable '__kernel_settings_tuned_dir' from source: role '' all vars 9892 1726773059.91945: variable '__kernel_settings_tuned_dir' from source: role '' all vars 9892 1726773059.92045: variable '__kernel_settings_tuned_dir' from source: role '' all vars 9892 1726773059.92064: variable 'omit' from source: magic vars 9892 1726773059.92133: variable 'ansible_host' from source: host vars for 'managed_node3' 9892 1726773059.92143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9892 1726773059.92152: variable 'omit' from source: magic vars 9892 1726773059.92324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9892 1726773059.93946: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9892 1726773059.94017: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9892 1726773059.94051: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9892 1726773059.94091: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9892 1726773059.94117: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9892 1726773059.94196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9892 1726773059.94225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9892 1726773059.94248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9892 1726773059.94318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9892 1726773059.94333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9892 1726773059.94436: variable 'item' from source: unknown 9892 1726773059.94450: Evaluated conditional (item | length > 0): False 9892 1726773059.94454: when evaluation is False, skipping this task 9892 1726773059.94488: variable 'item' from source: unknown 9892 1726773059.94560: variable 'item' from source: unknown skipping: [managed_node3] => (item=) => { 
"ansible_loop_var": "item", "changed": false, "false_condition": "item | length > 0", "item": "", "skip_reason": "Conditional result was False" } 9892 1726773059.94653: variable 'ansible_host' from source: host vars for 'managed_node3' 9892 1726773059.94663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9892 1726773059.94675: variable 'omit' from source: magic vars 9892 1726773059.94806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9892 1726773059.94825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9892 1726773059.94842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9892 1726773059.94880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9892 1726773059.94892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9892 1726773059.94942: variable 'item' from source: unknown 9892 1726773059.94949: Evaluated conditional (item | length > 0): True 9892 1726773059.94955: variable 'omit' from source: magic vars 9892 1726773059.94992: variable 'omit' from source: magic vars 9892 1726773059.95019: variable 'item' from source: unknown 9892 1726773059.95063: variable 'item' from source: unknown 9892 1726773059.95078: variable 'omit' from source: magic vars 9892 1726773059.95097: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9892 1726773059.95113: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9892 1726773059.95125: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9892 1726773059.95136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9892 1726773059.95143: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9892 1726773059.95163: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9892 1726773059.95166: variable 'ansible_host' from source: host vars for 'managed_node3' 9892 1726773059.95171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9892 1726773059.95248: Set connection var ansible_timeout to 10 9892 1726773059.95253: Set connection var ansible_shell_type to sh 9892 1726773059.95259: Set connection var ansible_module_compression to ZIP_DEFLATED 9892 1726773059.95264: Set connection var ansible_shell_executable to /bin/sh 9892 1726773059.95272: Set connection var ansible_pipelining to False 9892 1726773059.95279: Set connection var ansible_connection to ssh 9892 1726773059.95295: variable 'ansible_shell_executable' from source: unknown 9892 1726773059.95310: variable 'ansible_connection' from source: unknown 9892 1726773059.95313: variable 
'ansible_module_compression' from source: unknown 9892 1726773059.95314: variable 'ansible_shell_type' from source: unknown 9892 1726773059.95316: variable 'ansible_shell_executable' from source: unknown 9892 1726773059.95318: variable 'ansible_host' from source: host vars for 'managed_node3' 9892 1726773059.95320: variable 'ansible_pipelining' from source: unknown 9892 1726773059.95321: variable 'ansible_timeout' from source: unknown 9892 1726773059.95324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9892 1726773059.95487: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9892 1726773059.95498: variable 'omit' from source: magic vars 9892 1726773059.95504: starting attempt loop 9892 1726773059.95508: running the handler 9892 1726773059.95519: _low_level_execute_command(): starting 9892 1726773059.95527: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9892 1726773059.98100: stdout chunk (state=2): >>>/root <<< 9892 1726773059.98227: stderr chunk (state=3): >>><<< 9892 1726773059.98239: stdout chunk (state=3): >>><<< 9892 1726773059.98258: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9892 1726773059.98271: _low_level_execute_command(): starting 9892 1726773059.98278: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773059.9826553-9892-4583043093403 `" && echo ansible-tmp-1726773059.9826553-9892-4583043093403="` echo /root/.ansible/tmp/ansible-tmp-1726773059.9826553-9892-4583043093403 `" ) && sleep 0' 9892 1726773060.01095: stdout chunk (state=2): >>>ansible-tmp-1726773059.9826553-9892-4583043093403=/root/.ansible/tmp/ansible-tmp-1726773059.9826553-9892-4583043093403 <<< 9892 1726773060.01234: stderr chunk (state=3): >>><<< 9892 1726773060.01242: stdout chunk (state=3): >>><<< 9892 1726773060.01259: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773059.9826553-9892-4583043093403=/root/.ansible/tmp/ansible-tmp-1726773059.9826553-9892-4583043093403 , stderr= 9892 1726773060.01304: variable 'ansible_module_compression' from source: unknown 9892 1726773060.01345: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 9892 1726773060.01377: variable 'ansible_facts' from source: unknown 9892 1726773060.01445: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773059.9826553-9892-4583043093403/AnsiballZ_stat.py 9892 1726773060.01555: Sending initial data 9892 1726773060.01562: Sent initial data (149 bytes) 9892 1726773060.04221: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpemg87fjf /root/.ansible/tmp/ansible-tmp-1726773059.9826553-9892-4583043093403/AnsiballZ_stat.py <<< 9892 1726773060.05395: stderr chunk (state=3): >>><<< 9892 1726773060.05404: stdout chunk (state=3): >>><<< 9892 1726773060.05424: done transferring module to remote 9892 1726773060.05434: _low_level_execute_command(): starting 9892 1726773060.05440: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773059.9826553-9892-4583043093403/ /root/.ansible/tmp/ansible-tmp-1726773059.9826553-9892-4583043093403/AnsiballZ_stat.py && sleep 0' 9892 1726773060.07932: 
stderr chunk (state=2): >>><<< 9892 1726773060.07945: stdout chunk (state=2): >>><<< 9892 1726773060.07962: _low_level_execute_command() done: rc=0, stdout=, stderr= 9892 1726773060.07966: _low_level_execute_command(): starting 9892 1726773060.07975: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773059.9826553-9892-4583043093403/AnsiballZ_stat.py && sleep 0' 9892 1726773060.23091: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 9892 1726773060.24192: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 9892 1726773060.24231: stderr chunk (state=3): >>><<< 9892 1726773060.24239: stdout chunk (state=3): >>><<< 9892 1726773060.24255: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.47.99 closed. 9892 1726773060.24275: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773059.9826553-9892-4583043093403/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9892 1726773060.24288: _low_level_execute_command(): starting 9892 1726773060.24295: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773059.9826553-9892-4583043093403/ > /dev/null 2>&1 && sleep 0' 9892 1726773060.26780: stderr chunk (state=2): >>><<< 9892 1726773060.26790: stdout chunk (state=2): >>><<< 9892 1726773060.26805: _low_level_execute_command() done: rc=0, stdout=, stderr= 9892 1726773060.26812: handler run complete 9892 1726773060.26827: attempt loop complete, returning result 9892 1726773060.26842: variable 'item' from source: unknown 9892 1726773060.26908: variable 'item' from source: unknown ok: [managed_node3] => (item=/etc/tuned/profiles) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned/profiles", "stat": { "exists": false } } 9892 1726773060.26997: variable 'ansible_host' from source: host vars for 'managed_node3' 9892 1726773060.27008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9892 1726773060.27018: variable 'omit' from source: magic vars 9892 1726773060.27123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9892 1726773060.27147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9892 1726773060.27165: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9892 1726773060.27196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9892 1726773060.27208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9892 1726773060.27265: variable 'item' from source: unknown 9892 1726773060.27277: Evaluated conditional (item | length > 0): True 9892 1726773060.27283: variable 'omit' from source: magic vars 9892 1726773060.27298: variable 'omit' from source: magic vars 9892 1726773060.27327: variable 'item' from source: unknown 9892 1726773060.27372: variable 'item' from source: unknown 9892 1726773060.27388: variable 'omit' from source: magic vars 9892 1726773060.27405: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9892 1726773060.27414: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9892 1726773060.27420: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9892 1726773060.27432: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9892 1726773060.27436: variable 'ansible_host' from source: host vars for 'managed_node3' 9892 1726773060.27440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9892 1726773060.27492: Set connection var ansible_timeout to 10 9892 1726773060.27496: Set connection var ansible_shell_type to sh 9892 1726773060.27502: Set connection var ansible_module_compression to ZIP_DEFLATED 9892 1726773060.27508: Set connection var ansible_shell_executable to /bin/sh 9892 1726773060.27513: Set connection var ansible_pipelining to False 9892 1726773060.27519: Set connection var ansible_connection to ssh 9892 1726773060.27532: variable 'ansible_shell_executable' from source: unknown 9892 1726773060.27536: variable 'ansible_connection' from source: unknown 9892 1726773060.27540: variable 'ansible_module_compression' from source: unknown 9892 1726773060.27543: variable 'ansible_shell_type' from source: unknown 9892 1726773060.27546: variable 'ansible_shell_executable' from source: unknown 9892 1726773060.27550: variable 'ansible_host' from source: host vars for 'managed_node3' 9892 1726773060.27554: variable 'ansible_pipelining' from source: unknown 9892 1726773060.27557: variable 'ansible_timeout' from source: unknown 9892 1726773060.27561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9892 1726773060.27630: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9892 1726773060.27640: variable 'omit' from source: magic vars 9892 1726773060.27645: starting attempt loop 9892 1726773060.27649: running the handler 9892 1726773060.27655: _low_level_execute_command(): starting 9892 
1726773060.27659: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9892 1726773060.29952: stdout chunk (state=2): >>>/root <<< 9892 1726773060.30075: stderr chunk (state=3): >>><<< 9892 1726773060.30082: stdout chunk (state=3): >>><<< 9892 1726773060.30099: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9892 1726773060.30109: _low_level_execute_command(): starting 9892 1726773060.30115: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773060.3010612-9892-11449961239942 `" && echo ansible-tmp-1726773060.3010612-9892-11449961239942="` echo /root/.ansible/tmp/ansible-tmp-1726773060.3010612-9892-11449961239942 `" ) && sleep 0' 9892 1726773060.32648: stdout chunk (state=2): >>>ansible-tmp-1726773060.3010612-9892-11449961239942=/root/.ansible/tmp/ansible-tmp-1726773060.3010612-9892-11449961239942 <<< 9892 1726773060.32786: stderr chunk (state=3): >>><<< 9892 1726773060.32795: stdout chunk (state=3): >>><<< 9892 1726773060.32809: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773060.3010612-9892-11449961239942=/root/.ansible/tmp/ansible-tmp-1726773060.3010612-9892-11449961239942 , stderr= 9892 1726773060.32839: variable 'ansible_module_compression' from source: unknown 9892 1726773060.32875: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 9892 1726773060.32894: variable 'ansible_facts' from source: unknown 9892 1726773060.32950: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773060.3010612-9892-11449961239942/AnsiballZ_stat.py 9892 1726773060.33046: Sending initial data 9892 1726773060.33053: Sent initial data (150 bytes) 9892 1726773060.35905: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpj3tfjmob /root/.ansible/tmp/ansible-tmp-1726773060.3010612-9892-11449961239942/AnsiballZ_stat.py <<< 9892 1726773060.37237: stderr chunk (state=3): >>><<< 9892 1726773060.37250: stdout chunk (state=3): >>><<< 9892 1726773060.37272: done transferring module to remote 9892 1726773060.37283: _low_level_execute_command(): starting 9892 1726773060.37290: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773060.3010612-9892-11449961239942/ /root/.ansible/tmp/ansible-tmp-1726773060.3010612-9892-11449961239942/AnsiballZ_stat.py && sleep 0' 9892 1726773060.39766: stderr chunk (state=2): >>><<< 9892 1726773060.39778: stdout chunk (state=2): >>><<< 9892 1726773060.39796: _low_level_execute_command() done: rc=0, stdout=, stderr= 9892 1726773060.39800: _low_level_execute_command(): starting 9892 1726773060.39806: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773060.3010612-9892-11449961239942/AnsiballZ_stat.py && sleep 0' 9892 1726773060.55930: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 136, "inode": 917919, "dev": 51713, "nlink": 3, "atime": 1726773038.2860043, "mtime": 1726773050.952053, "ctime": 1726773050.952053, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, 
"device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 9892 1726773060.57033: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 9892 1726773060.57045: stdout chunk (state=3): >>><<< 9892 1726773060.57060: stderr chunk (state=3): >>><<< 9892 1726773060.57076: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 136, "inode": 917919, "dev": 51713, "nlink": 3, "atime": 1726773038.2860043, "mtime": 1726773050.952053, "ctime": 1726773050.952053, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.47.99 closed. 9892 1726773060.57126: done with _execute_module (stat, {'path': '/etc/tuned', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773060.3010612-9892-11449961239942/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9892 1726773060.57135: _low_level_execute_command(): starting 9892 1726773060.57143: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773060.3010612-9892-11449961239942/ > /dev/null 2>&1 && sleep 0' 9892 1726773060.59683: stderr chunk (state=2): >>><<< 9892 1726773060.59694: stdout chunk (state=2): >>><<< 9892 1726773060.59708: _low_level_execute_command() done: rc=0, stdout=, stderr= 9892 1726773060.59715: handler run complete 9892 1726773060.59744: attempt loop complete, returning result 9892 1726773060.59759: variable 'item' from source: unknown 9892 1726773060.59819: variable 'item' from source: unknown ok: [managed_node3] => (item=/etc/tuned) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned", "stat": { "atime": 1726773038.2860043, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1726773050.952053, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 917919, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": 
"inode/directory", "mode": "0755", "mtime": 1726773050.952053, "nlink": 3, "path": "/etc/tuned", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 136, "uid": 0, "version": "1785990601", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 9892 1726773060.59854: dumping result to json 9892 1726773060.59861: done dumping result, returning 9892 1726773060.59867: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory [0affffe7-6841-7dd6-8fa6-000000000050] 9892 1726773060.59872: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000050 9892 1726773060.59902: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000050 9892 1726773060.59905: WORKER PROCESS EXITING 9733 1726773060.60324: no more pending results, returning what we have 9733 1726773060.60326: results queue empty 9733 1726773060.60326: checking for any_errors_fatal 9733 1726773060.60329: done checking for any_errors_fatal 9733 1726773060.60330: checking for max_fail_percentage 9733 1726773060.60331: done checking for max_fail_percentage 9733 1726773060.60331: checking to see if all hosts have failed and the running result is not ok 9733 1726773060.60331: done checking to see if all hosts have failed 9733 1726773060.60332: getting the remaining hosts for this loop 9733 1726773060.60333: done getting the remaining hosts for this loop 9733 1726773060.60335: getting the next task for host managed_node3 9733 1726773060.60338: done getting next task for host managed_node3 9733 1726773060.60341: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 9733 1726773060.60342: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773060.60348: getting variables 9733 1726773060.60349: in VariableManager get_vars() 9733 1726773060.60367: Calling all_inventory to load vars for managed_node3 9733 1726773060.60370: Calling groups_inventory to load vars for managed_node3 9733 1726773060.60371: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773060.60378: Calling all_plugins_play to load vars for managed_node3 9733 1726773060.60380: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773060.60381: Calling groups_plugins_play to load vars for managed_node3 9733 1726773060.60476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773060.60589: done with get_vars() 9733 1726773060.60597: done getting variables 9733 1726773060.60639: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:63 Thursday 19 September 2024 15:11:00 -0400 (0:00:00.697) 0:00:06.339 **** 9733 1726773060.60659: entering _queue_task() for managed_node3/set_fact 9733 1726773060.60832: worker is 1 (out of 1 available) 9733 1726773060.60845: exiting _queue_task() for managed_node3/set_fact 9733 1726773060.60858: done queuing things up, now waiting for results queue to drain 9733 1726773060.60860: waiting for pending results... 9921 1726773060.60977: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 9921 1726773060.61082: in run() - task 0affffe7-6841-7dd6-8fa6-000000000051 9921 1726773060.61102: variable 'ansible_search_path' from source: unknown 9921 1726773060.61106: variable 'ansible_search_path' from source: unknown 9921 1726773060.61134: calling self._execute() 9921 1726773060.61194: variable 'ansible_host' from source: host vars for 'managed_node3' 9921 1726773060.61204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9921 1726773060.61212: variable 'omit' from source: magic vars 9921 1726773060.61284: variable 'omit' from source: magic vars 9921 1726773060.61326: variable 'omit' from source: magic vars 9921 1726773060.61771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9921 1726773060.63382: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9921 1726773060.63439: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9921 1726773060.63467: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9921 1726773060.63498: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9921 1726773060.63518: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9921 1726773060.63578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 9921 1726773060.63601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9921 1726773060.63619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9921 1726773060.63648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9921 1726773060.63660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9921 1726773060.63697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9921 1726773060.63713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9921 1726773060.63730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9921 1726773060.63758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9921 1726773060.63774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9921 1726773060.63814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9921 1726773060.63832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9921 1726773060.63849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9921 1726773060.63879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9921 1726773060.63892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9921 1726773060.64044: variable '__kernel_settings_find_profile_dirs' from source: set_fact 9921 1726773060.64113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9921 1726773060.64222: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9921 1726773060.64248: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9921 1726773060.64272: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9921 1726773060.64297: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9921 1726773060.64327: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9921 1726773060.64543: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9921 1726773060.64561: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9921 1726773060.64582: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 9921 1726773060.64622: variable 'omit' from source: magic vars 9921 1726773060.64642: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9921 1726773060.64661: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9921 1726773060.64678: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9921 1726773060.64693: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9921 1726773060.64702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9921 1726773060.64726: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9921 1726773060.64732: variable 'ansible_host' from source: host vars for 'managed_node3' 9921 1726773060.64736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9921 1726773060.64807: Set connection var ansible_timeout to 10 9921 1726773060.64812: Set connection var ansible_shell_type to sh 9921 1726773060.64818: Set connection var ansible_module_compression to ZIP_DEFLATED 9921 1726773060.64824: Set connection var ansible_shell_executable to /bin/sh 9921 1726773060.64829: Set connection var ansible_pipelining to False 9921 1726773060.64836: Set connection var ansible_connection to ssh 9921 1726773060.64855: variable 'ansible_shell_executable' from source: unknown 9921 1726773060.64859: variable 'ansible_connection' from source: unknown 9921 1726773060.64862: variable 'ansible_module_compression' from source: unknown 9921 1726773060.64866: variable 'ansible_shell_type' from source: unknown 9921 1726773060.64871: variable 'ansible_shell_executable' from source: unknown 9921 1726773060.64875: variable 'ansible_host' from source: host vars for 'managed_node3' 9921 1726773060.64879: variable 'ansible_pipelining' from source: unknown 9921 1726773060.64883: variable 'ansible_timeout' from source: unknown 9921 1726773060.64889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9921 1726773060.64951: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9921 1726773060.64963: variable 'omit' from source: magic vars 9921 1726773060.64972: starting attempt loop 9921 1726773060.64975: running the handler 9921 1726773060.64987: handler run complete 9921 1726773060.64995: attempt loop complete, returning result 9921 1726773060.64998: _execute() done 9921 1726773060.65001: dumping result to json 9921 1726773060.65004: done dumping result, returning 9921 1726773060.65011: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir [0affffe7-6841-7dd6-8fa6-000000000051] 9921 1726773060.65015: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000051 9921 1726773060.65032: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000051 9921 1726773060.65034: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_profile_parent": "/etc/tuned" }, "changed": false } 9733 1726773060.65218: no more pending results, returning what we have 9733 1726773060.65222: results queue empty 9733 1726773060.65223: checking for any_errors_fatal 9733 1726773060.65229: done checking for any_errors_fatal 9733 1726773060.65230: checking for max_fail_percentage 9733 1726773060.65231: done checking for max_fail_percentage 9733 1726773060.65232: checking to see if all hosts have failed and the running result is not ok 9733 1726773060.65232: done checking to see if all hosts have failed 9733 1726773060.65233: getting the remaining hosts for this loop 9733 1726773060.65234: done getting the remaining hosts for this loop 9733 1726773060.65237: getting the next task for host managed_node3 9733 1726773060.65242: done getting next task for host managed_node3 9733 1726773060.65245: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 9733 1726773060.65247: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773060.65256: getting variables 9733 1726773060.65257: in VariableManager get_vars() 9733 1726773060.65287: Calling all_inventory to load vars for managed_node3 9733 1726773060.65289: Calling groups_inventory to load vars for managed_node3 9733 1726773060.65291: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773060.65300: Calling all_plugins_play to load vars for managed_node3 9733 1726773060.65302: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773060.65304: Calling groups_plugins_play to load vars for managed_node3 9733 1726773060.65410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773060.65549: done with get_vars() 9733 1726773060.65556: done getting variables 9733 1726773060.65624: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67 Thursday 19 September 2024 15:11:00 -0400 (0:00:00.049) 0:00:06.388 **** 9733 1726773060.65645: entering _queue_task() for managed_node3/service 9733 1726773060.65646: Creating lock for service 9733 1726773060.65827: worker is 1 (out of 1 available) 9733 1726773060.65843: exiting _queue_task() for managed_node3/service 9733 1726773060.65855: done queuing things up, now waiting for results queue to drain 9733 1726773060.65857: waiting for pending results... 
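
The trace up to this point shows the role's directory probe and the set_fact that follows it: the stat loop finds that /etc/tuned/profiles does not exist while /etc/tuned does, and __kernel_settings_profile_parent ends up as /etc/tuned. A minimal sketch of that pattern follows; only the two candidate paths and the variable names __kernel_settings_find_profile_dirs and __kernel_settings_profile_parent come from the log, while the task wording and the selection expression are assumptions, not the role's actual source.

# Illustrative sketch only -- reconstructed from the module arguments and
# variable names visible in the trace above, not the role's real tasks file.
- name: Find tuned profile parent directory (sketch)
  ansible.builtin.stat:
    path: "{{ item }}"
  register: __kernel_settings_find_profile_dirs
  loop:
    - /etc/tuned/profiles
    - /etc/tuned

- name: Set tuned profile parent dir (sketch)
  ansible.builtin.set_fact:
    # Pick the first candidate that exists; this expression is a guess that
    # happens to reproduce the result seen in the log (/etc/tuned).
    __kernel_settings_profile_parent: "{{ __kernel_settings_find_profile_dirs.results
      | selectattr('stat.exists') | map(attribute='item') | first }}"
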
9923 1726773060.65978: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 9923 1726773060.66082: in run() - task 0affffe7-6841-7dd6-8fa6-000000000052 9923 1726773060.66099: variable 'ansible_search_path' from source: unknown 9923 1726773060.66103: variable 'ansible_search_path' from source: unknown 9923 1726773060.66138: variable '__kernel_settings_services' from source: include_vars 9923 1726773060.66367: variable '__kernel_settings_services' from source: include_vars 9923 1726773060.66434: variable 'omit' from source: magic vars 9923 1726773060.66516: variable 'ansible_host' from source: host vars for 'managed_node3' 9923 1726773060.66527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9923 1726773060.66535: variable 'omit' from source: magic vars 9923 1726773060.66594: variable 'omit' from source: magic vars 9923 1726773060.66627: variable 'omit' from source: magic vars 9923 1726773060.66660: variable 'item' from source: unknown 9923 1726773060.66724: variable 'item' from source: unknown 9923 1726773060.66746: variable 'omit' from source: magic vars 9923 1726773060.66779: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9923 1726773060.66806: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9923 1726773060.66825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9923 1726773060.66838: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9923 1726773060.66849: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9923 1726773060.66873: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9923 1726773060.66878: variable 'ansible_host' from source: host vars for 'managed_node3' 9923 1726773060.66883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9923 1726773060.66949: Set connection var ansible_timeout to 10 9923 1726773060.66953: Set connection var ansible_shell_type to sh 9923 1726773060.66957: Set connection var ansible_module_compression to ZIP_DEFLATED 9923 1726773060.66960: Set connection var ansible_shell_executable to /bin/sh 9923 1726773060.66963: Set connection var ansible_pipelining to False 9923 1726773060.66967: Set connection var ansible_connection to ssh 9923 1726773060.66978: variable 'ansible_shell_executable' from source: unknown 9923 1726773060.66982: variable 'ansible_connection' from source: unknown 9923 1726773060.66986: variable 'ansible_module_compression' from source: unknown 9923 1726773060.66988: variable 'ansible_shell_type' from source: unknown 9923 1726773060.66990: variable 'ansible_shell_executable' from source: unknown 9923 1726773060.66992: variable 'ansible_host' from source: host vars for 'managed_node3' 9923 1726773060.66995: variable 'ansible_pipelining' from source: unknown 9923 1726773060.66997: variable 'ansible_timeout' from source: unknown 9923 1726773060.66999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9923 1726773060.67089: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9923 1726773060.67100: variable 'omit' from source: magic vars 9923 1726773060.67106: starting attempt loop 9923 1726773060.67108: running the handler 9923 1726773060.67165: variable 'ansible_facts' from source: unknown 9923 1726773060.67247: _low_level_execute_command(): starting 9923 1726773060.67254: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9923 1726773060.69675: stdout chunk (state=2): >>>/root <<< 9923 1726773060.69793: stderr chunk (state=3): >>><<< 9923 1726773060.69800: stdout chunk (state=3): >>><<< 9923 1726773060.69819: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9923 1726773060.69832: _low_level_execute_command(): starting 9923 1726773060.69838: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773060.698266-9923-215810459491053 `" && echo ansible-tmp-1726773060.698266-9923-215810459491053="` echo /root/.ansible/tmp/ansible-tmp-1726773060.698266-9923-215810459491053 `" ) && sleep 0' 9923 1726773060.72396: stdout chunk (state=2): >>>ansible-tmp-1726773060.698266-9923-215810459491053=/root/.ansible/tmp/ansible-tmp-1726773060.698266-9923-215810459491053 <<< 9923 1726773060.72528: stderr chunk (state=3): >>><<< 9923 1726773060.72535: stdout chunk (state=3): >>><<< 9923 1726773060.72550: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773060.698266-9923-215810459491053=/root/.ansible/tmp/ansible-tmp-1726773060.698266-9923-215810459491053 , stderr= 9923 1726773060.72578: variable 'ansible_module_compression' from source: unknown 9923 1726773060.72625: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 9923 1726773060.72631: ANSIBALLZ: Acquiring lock 9923 1726773060.72635: ANSIBALLZ: Lock acquired: 139792132305312 9923 1726773060.72639: ANSIBALLZ: Creating module 9923 1726773060.97041: ANSIBALLZ: Writing module into payload 9923 1726773060.97192: ANSIBALLZ: Writing module 9923 1726773060.97218: ANSIBALLZ: Renaming module 9923 1726773060.97225: ANSIBALLZ: Done creating module 9923 1726773060.97256: variable 'ansible_facts' from source: unknown 9923 1726773060.97418: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773060.698266-9923-215810459491053/AnsiballZ_systemd.py 9923 1726773060.97526: Sending initial data 9923 1726773060.97533: Sent initial data (153 bytes) 9923 1726773061.00179: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpr53qesrt /root/.ansible/tmp/ansible-tmp-1726773060.698266-9923-215810459491053/AnsiballZ_systemd.py <<< 9923 1726773061.02211: stderr chunk (state=3): >>><<< 9923 1726773061.02221: stdout chunk (state=3): >>><<< 9923 1726773061.02244: done transferring module to remote 9923 1726773061.02256: _low_level_execute_command(): starting 9923 1726773061.02261: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773060.698266-9923-215810459491053/ /root/.ansible/tmp/ansible-tmp-1726773060.698266-9923-215810459491053/AnsiballZ_systemd.py && sleep 0' 9923 1726773061.04723: stderr chunk (state=2): >>><<< 9923 1726773061.04735: stdout chunk (state=2): >>><<< 9923 1726773061.04753: _low_level_execute_command() done: rc=0, stdout=, stderr= 9923 1726773061.04758: _low_level_execute_command(): starting 9923 
1726773061.04763: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773060.698266-9923-215810459491053/AnsiballZ_systemd.py && sleep 0' 9923 1726773061.32777: stdout chunk (state=2): >>> {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:38 EDT", "WatchdogTimestampMonotonic": "453344532", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "9802", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ExecMainStartTimestampMonotonic": "453204995", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "9802", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:38 EDT] ; stop_time=[n/a] ; pid=9802 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "17055744", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryH<<< 9923 1726773061.32815: stdout chunk (state=3): >>>igh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", 
"CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:38 EDT", "Stat<<< 9923 1726773061.32830: stdout chunk (state=3): >>>eChangeTimestampMonotonic": "453344536", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:38 EDT", "InactiveExitTimestampMonotonic": "453205057", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ActiveEnterTimestampMonotonic": "453344536", "ActiveExitTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ActiveExitTimestampMonotonic": "453097312", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:38 EDT", "InactiveEnterTimestampMonotonic": "453201635", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", 
"RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ConditionTimestampMonotonic": "453202685", "AssertTimestamp": "Thu 2024-09-19 15:10:38 EDT", "AssertTimestampMonotonic": "453202686", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "29d42365ee9e42d1916b9ebf15b9284e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 9923 1726773061.34457: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 9923 1726773061.34508: stderr chunk (state=3): >>><<< 9923 1726773061.34516: stdout chunk (state=3): >>><<< 9923 1726773061.34535: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:38 EDT", "WatchdogTimestampMonotonic": "453344532", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "9802", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ExecMainStartTimestampMonotonic": "453204995", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "9802", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:38 EDT] ; stop_time=[n/a] ; pid=9802 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "17055744", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", 
"LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", 
"Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:38 EDT", "StateChangeTimestampMonotonic": "453344536", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:38 EDT", "InactiveExitTimestampMonotonic": "453205057", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ActiveEnterTimestampMonotonic": "453344536", "ActiveExitTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ActiveExitTimestampMonotonic": "453097312", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:38 EDT", "InactiveEnterTimestampMonotonic": "453201635", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ConditionTimestampMonotonic": "453202685", "AssertTimestamp": "Thu 2024-09-19 15:10:38 EDT", "AssertTimestampMonotonic": "453202686", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "29d42365ee9e42d1916b9ebf15b9284e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.47.99 closed. 
9923 1726773061.34639: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773060.698266-9923-215810459491053/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9923 1726773061.34658: _low_level_execute_command(): starting 9923 1726773061.34665: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773060.698266-9923-215810459491053/ > /dev/null 2>&1 && sleep 0' 9923 1726773061.37140: stderr chunk (state=2): >>><<< 9923 1726773061.37151: stdout chunk (state=2): >>><<< 9923 1726773061.37167: _low_level_execute_command() done: rc=0, stdout=, stderr= 9923 1726773061.37176: handler run complete 9923 1726773061.37211: attempt loop complete, returning result 9923 1726773061.37229: variable 'item' from source: unknown 9923 1726773061.37293: variable 'item' from source: unknown ok: [managed_node3] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ActiveEnterTimestampMonotonic": "453344536", "ActiveExitTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ActiveExitTimestampMonotonic": "453097312", "ActiveState": "active", "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 15:10:38 EDT", "AssertTimestampMonotonic": "453202686", "Before": "shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ConditionTimestampMonotonic": "453202685", "ConfigurationDirectoryMode": "0755", "Conflicts": 
"power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "9802", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ExecMainStartTimestampMonotonic": "453204995", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:38 EDT] ; stop_time=[n/a] ; pid=9802 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:38 EDT", "InactiveEnterTimestampMonotonic": "453201635", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:38 EDT", "InactiveExitTimestampMonotonic": "453205057", "InvocationID": "29d42365ee9e42d1916b9ebf15b9284e", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "9802", "MemoryAccounting": "yes", "MemoryCurrent": "17055744", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", 
"PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 15:10:38 EDT", "StateChangeTimestampMonotonic": "453344536", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 15:10:38 EDT", "WatchdogTimestampMonotonic": "453344532", "WatchdogUSec": "0" } } 9923 1726773061.37399: dumping result to json 9923 1726773061.37418: done dumping result, returning 9923 1726773061.37427: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started [0affffe7-6841-7dd6-8fa6-000000000052] 9923 1726773061.37432: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000052 9923 1726773061.37537: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000052 9923 1726773061.37542: WORKER PROCESS EXITING 9733 1726773061.37879: no more pending results, returning what we have 9733 1726773061.37881: results queue empty 9733 1726773061.37881: checking for any_errors_fatal 9733 1726773061.37884: done checking for any_errors_fatal 9733 1726773061.37886: checking for max_fail_percentage 9733 1726773061.37888: done checking for max_fail_percentage 9733 1726773061.37888: checking to see if all hosts have failed and the running result is not ok 9733 1726773061.37889: done checking to see if all hosts have failed 9733 1726773061.37889: getting the remaining hosts for this loop 9733 1726773061.37890: done getting the remaining hosts for this loop 9733 1726773061.37892: getting the next task for host managed_node3 9733 1726773061.37898: done getting next task for host managed_node3 9733 1726773061.37900: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 9733 1726773061.37902: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773061.37908: getting variables 9733 1726773061.37909: in VariableManager get_vars() 9733 1726773061.37936: Calling all_inventory to load vars for managed_node3 9733 1726773061.37938: Calling groups_inventory to load vars for managed_node3 9733 1726773061.37939: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773061.37946: Calling all_plugins_play to load vars for managed_node3 9733 1726773061.37948: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773061.37949: Calling groups_plugins_play to load vars for managed_node3 9733 1726773061.38053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773061.38171: done with get_vars() 9733 1726773061.38178: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74 Thursday 19 September 2024 15:11:01 -0400 (0:00:00.725) 0:00:07.114 **** 9733 1726773061.38244: entering _queue_task() for managed_node3/file 9733 1726773061.38426: worker is 1 (out of 1 available) 9733 1726773061.38441: exiting _queue_task() for managed_node3/file 9733 1726773061.38454: done queuing things up, now waiting for results queue to drain 9733 1726773061.38456: waiting for pending results... 
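For reference, a minimal sketch of the "Ensure required services are enabled and started" task whose result is traced above. The module arguments (name, state, enabled) come from the logged ansible.legacy.systemd invocation for item=tuned; the loop list name is a hypothetical placeholder, not necessarily the role's actual variable.

# Hedged reconstruction -- not the role's literal source.
- name: Ensure required services are enabled and started
  systemd:
    name: "{{ item }}"        # this run iterated over item=tuned
    state: started            # from the logged module args
    enabled: true             # from the logged module args
  loop: "{{ kernel_settings_services }}"   # hypothetical list name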
9946 1726773061.38570: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 9946 1726773061.38674: in run() - task 0affffe7-6841-7dd6-8fa6-000000000053 9946 1726773061.38690: variable 'ansible_search_path' from source: unknown 9946 1726773061.38694: variable 'ansible_search_path' from source: unknown 9946 1726773061.38720: calling self._execute() 9946 1726773061.38777: variable 'ansible_host' from source: host vars for 'managed_node3' 9946 1726773061.38784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9946 1726773061.38792: variable 'omit' from source: magic vars 9946 1726773061.38860: variable 'omit' from source: magic vars 9946 1726773061.38895: variable 'omit' from source: magic vars 9946 1726773061.38924: variable '__kernel_settings_profile_dir' from source: role '' all vars 9946 1726773061.39148: variable '__kernel_settings_profile_dir' from source: role '' all vars 9946 1726773061.39216: variable '__kernel_settings_profile_parent' from source: set_fact 9946 1726773061.39226: variable '__kernel_settings_tuned_profile' from source: role '' all vars 9946 1726773061.39260: variable 'omit' from source: magic vars 9946 1726773061.39295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9946 1726773061.39324: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9946 1726773061.39340: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9946 1726773061.39352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9946 1726773061.39362: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9946 1726773061.39384: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9946 1726773061.39401: variable 'ansible_host' from source: host vars for 'managed_node3' 9946 1726773061.39406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9946 1726773061.39477: Set connection var ansible_timeout to 10 9946 1726773061.39482: Set connection var ansible_shell_type to sh 9946 1726773061.39490: Set connection var ansible_module_compression to ZIP_DEFLATED 9946 1726773061.39496: Set connection var ansible_shell_executable to /bin/sh 9946 1726773061.39502: Set connection var ansible_pipelining to False 9946 1726773061.39508: Set connection var ansible_connection to ssh 9946 1726773061.39523: variable 'ansible_shell_executable' from source: unknown 9946 1726773061.39527: variable 'ansible_connection' from source: unknown 9946 1726773061.39531: variable 'ansible_module_compression' from source: unknown 9946 1726773061.39534: variable 'ansible_shell_type' from source: unknown 9946 1726773061.39537: variable 'ansible_shell_executable' from source: unknown 9946 1726773061.39541: variable 'ansible_host' from source: host vars for 'managed_node3' 9946 1726773061.39545: variable 'ansible_pipelining' from source: unknown 9946 1726773061.39549: variable 'ansible_timeout' from source: unknown 9946 1726773061.39553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9946 1726773061.39693: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9946 1726773061.39704: variable 'omit' from source: magic vars 9946 1726773061.39711: starting attempt loop 9946 1726773061.39714: running the handler 9946 1726773061.39725: _low_level_execute_command(): starting 9946 1726773061.39732: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9946 1726773061.42205: stdout chunk (state=2): >>>/root <<< 9946 1726773061.42329: stderr chunk (state=3): >>><<< 9946 1726773061.42338: stdout chunk (state=3): >>><<< 9946 1726773061.42360: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9946 1726773061.42375: _low_level_execute_command(): starting 9946 1726773061.42382: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773061.4237049-9946-81410740216390 `" && echo ansible-tmp-1726773061.4237049-9946-81410740216390="` echo /root/.ansible/tmp/ansible-tmp-1726773061.4237049-9946-81410740216390 `" ) && sleep 0' 9946 1726773061.45292: stdout chunk (state=2): >>>ansible-tmp-1726773061.4237049-9946-81410740216390=/root/.ansible/tmp/ansible-tmp-1726773061.4237049-9946-81410740216390 <<< 9946 1726773061.45306: stderr chunk (state=2): >>><<< 9946 1726773061.45321: stdout chunk (state=3): >>><<< 9946 1726773061.45335: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773061.4237049-9946-81410740216390=/root/.ansible/tmp/ansible-tmp-1726773061.4237049-9946-81410740216390 , stderr= 9946 1726773061.45387: variable 'ansible_module_compression' from source: unknown 9946 1726773061.45447: ANSIBALLZ: Using lock for file 9946 1726773061.45453: ANSIBALLZ: Acquiring lock 9946 1726773061.45456: ANSIBALLZ: Lock acquired: 139792132688640 9946 1726773061.45460: ANSIBALLZ: Creating module 9946 1726773061.56671: ANSIBALLZ: Writing module into payload 9946 1726773061.56880: ANSIBALLZ: Writing module 9946 1726773061.56905: ANSIBALLZ: Renaming module 9946 1726773061.56913: ANSIBALLZ: Done creating module 9946 1726773061.56930: variable 'ansible_facts' from source: unknown 9946 1726773061.57021: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773061.4237049-9946-81410740216390/AnsiballZ_file.py 9946 1726773061.57299: Sending initial data 9946 1726773061.57306: Sent initial data (150 bytes) 9946 1726773061.59835: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmp86987u8a /root/.ansible/tmp/ansible-tmp-1726773061.4237049-9946-81410740216390/AnsiballZ_file.py <<< 9946 1726773061.61045: stderr chunk (state=3): >>><<< 9946 1726773061.61056: stdout chunk (state=3): >>><<< 9946 1726773061.61076: done transferring module to remote 9946 1726773061.61088: _low_level_execute_command(): starting 9946 1726773061.61094: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773061.4237049-9946-81410740216390/ /root/.ansible/tmp/ansible-tmp-1726773061.4237049-9946-81410740216390/AnsiballZ_file.py && sleep 0' 9946 1726773061.63574: stderr chunk (state=2): >>><<< 9946 1726773061.63587: stdout chunk (state=2): >>><<< 9946 1726773061.63603: _low_level_execute_command() done: rc=0, stdout=, stderr= 9946 1726773061.63608: _low_level_execute_command(): starting 9946 1726773061.63614: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python 
/root/.ansible/tmp/ansible-tmp-1726773061.4237049-9946-81410740216390/AnsiballZ_file.py && sleep 0' 9946 1726773061.80019: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "absent"}, "after": {"path": "/etc/tuned/kernel_settings", "state": "directory"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 6, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 9946 1726773061.81488: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 9946 1726773061.81500: stdout chunk (state=3): >>><<< 9946 1726773061.81515: stderr chunk (state=3): >>><<< 9946 1726773061.81527: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "absent"}, "after": {"path": "/etc/tuned/kernel_settings", "state": "directory"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 6, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.47.99 closed. 
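The directory creation above maps onto a plain file task. A minimal sketch, assuming the role passes __kernel_settings_profile_dir (which resolved to /etc/tuned/kernel_settings in this run, per the logged module args) straight to the module; this is a reconstruction, not the role's literal source.

# Hedged sketch of the "Ensure kernel settings profile directory exists" task.
- name: Ensure kernel settings profile directory exists
  file:
    path: "{{ __kernel_settings_profile_dir }}"   # /etc/tuned/kernel_settings in this run
    state: directory                              # from the logged module args
    mode: "0755"                                  # from the logged module args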
9946 1726773061.81557: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773061.4237049-9946-81410740216390/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9946 1726773061.81565: _low_level_execute_command(): starting 9946 1726773061.81571: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773061.4237049-9946-81410740216390/ > /dev/null 2>&1 && sleep 0' 9946 1726773061.84070: stderr chunk (state=2): >>><<< 9946 1726773061.84081: stdout chunk (state=2): >>><<< 9946 1726773061.84100: _low_level_execute_command() done: rc=0, stdout=, stderr= 9946 1726773061.84107: handler run complete 9946 1726773061.84135: attempt loop complete, returning result 9946 1726773061.84143: _execute() done 9946 1726773061.84148: dumping result to json 9946 1726773061.84154: done dumping result, returning 9946 1726773061.84161: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists [0affffe7-6841-7dd6-8fa6-000000000053] 9946 1726773061.84166: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000053 9946 1726773061.84203: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000053 9946 1726773061.84207: WORKER PROCESS EXITING changed: [managed_node3] => { "changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/tuned/kernel_settings", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 6, "state": "directory", "uid": 0 } 9733 1726773061.84391: no more pending results, returning what we have 9733 1726773061.84393: results queue empty 9733 1726773061.84394: checking for any_errors_fatal 9733 1726773061.84407: done checking for any_errors_fatal 9733 1726773061.84407: checking for max_fail_percentage 9733 1726773061.84409: done checking for max_fail_percentage 9733 1726773061.84409: checking to see if all hosts have failed and the running result is not ok 9733 1726773061.84410: done checking to see if all hosts have failed 9733 1726773061.84410: getting the remaining hosts for this loop 9733 1726773061.84411: done getting the remaining hosts for this loop 9733 1726773061.84414: getting the next task for host managed_node3 9733 1726773061.84419: done getting next task for host managed_node3 9733 1726773061.84422: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 9733 1726773061.84424: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 9733 1726773061.84433: getting variables 9733 1726773061.84434: in VariableManager get_vars() 9733 1726773061.84464: Calling all_inventory to load vars for managed_node3 9733 1726773061.84467: Calling groups_inventory to load vars for managed_node3 9733 1726773061.84471: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773061.84480: Calling all_plugins_play to load vars for managed_node3 9733 1726773061.84482: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773061.84484: Calling groups_plugins_play to load vars for managed_node3 9733 1726773061.84609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773061.84727: done with get_vars() 9733 1726773061.84735: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get active_profile] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80 Thursday 19 September 2024 15:11:01 -0400 (0:00:00.465) 0:00:07.580 **** 9733 1726773061.84805: entering _queue_task() for managed_node3/slurp 9733 1726773061.84989: worker is 1 (out of 1 available) 9733 1726773061.85002: exiting _queue_task() for managed_node3/slurp 9733 1726773061.85013: done queuing things up, now waiting for results queue to drain 9733 1726773061.85016: waiting for pending results... 9966 1726773061.85128: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 9966 1726773061.85235: in run() - task 0affffe7-6841-7dd6-8fa6-000000000054 9966 1726773061.85253: variable 'ansible_search_path' from source: unknown 9966 1726773061.85257: variable 'ansible_search_path' from source: unknown 9966 1726773061.85288: calling self._execute() 9966 1726773061.85347: variable 'ansible_host' from source: host vars for 'managed_node3' 9966 1726773061.85356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9966 1726773061.85364: variable 'omit' from source: magic vars 9966 1726773061.85440: variable 'omit' from source: magic vars 9966 1726773061.85475: variable 'omit' from source: magic vars 9966 1726773061.85498: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 9966 1726773061.85712: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 9966 1726773061.85771: variable '__kernel_settings_tuned_dir' from source: role '' all vars 9966 1726773061.85802: variable 'omit' from source: magic vars 9966 1726773061.85834: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9966 1726773061.85862: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9966 1726773061.85881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9966 1726773061.85897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9966 1726773061.85908: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9966 1726773061.85931: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9966 1726773061.85935: variable 'ansible_host' from source: host vars for 'managed_node3' 9966 1726773061.85941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9966 
1726773061.86016: Set connection var ansible_timeout to 10 9966 1726773061.86021: Set connection var ansible_shell_type to sh 9966 1726773061.86026: Set connection var ansible_module_compression to ZIP_DEFLATED 9966 1726773061.86032: Set connection var ansible_shell_executable to /bin/sh 9966 1726773061.86037: Set connection var ansible_pipelining to False 9966 1726773061.86044: Set connection var ansible_connection to ssh 9966 1726773061.86061: variable 'ansible_shell_executable' from source: unknown 9966 1726773061.86067: variable 'ansible_connection' from source: unknown 9966 1726773061.86072: variable 'ansible_module_compression' from source: unknown 9966 1726773061.86074: variable 'ansible_shell_type' from source: unknown 9966 1726773061.86075: variable 'ansible_shell_executable' from source: unknown 9966 1726773061.86077: variable 'ansible_host' from source: host vars for 'managed_node3' 9966 1726773061.86079: variable 'ansible_pipelining' from source: unknown 9966 1726773061.86081: variable 'ansible_timeout' from source: unknown 9966 1726773061.86083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9966 1726773061.86243: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9966 1726773061.86260: variable 'omit' from source: magic vars 9966 1726773061.86268: starting attempt loop 9966 1726773061.86272: running the handler 9966 1726773061.86284: _low_level_execute_command(): starting 9966 1726773061.86293: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9966 1726773061.89451: stdout chunk (state=2): >>>/root <<< 9966 1726773061.89465: stderr chunk (state=2): >>><<< 9966 1726773061.89483: stdout chunk (state=3): >>><<< 9966 1726773061.89501: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9966 1726773061.89516: _low_level_execute_command(): starting 9966 1726773061.89523: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773061.8951015-9966-207236940100363 `" && echo ansible-tmp-1726773061.8951015-9966-207236940100363="` echo /root/.ansible/tmp/ansible-tmp-1726773061.8951015-9966-207236940100363 `" ) && sleep 0' 9966 1726773061.92740: stdout chunk (state=2): >>>ansible-tmp-1726773061.8951015-9966-207236940100363=/root/.ansible/tmp/ansible-tmp-1726773061.8951015-9966-207236940100363 <<< 9966 1726773061.92962: stderr chunk (state=3): >>><<< 9966 1726773061.92975: stdout chunk (state=3): >>><<< 9966 1726773061.92995: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773061.8951015-9966-207236940100363=/root/.ansible/tmp/ansible-tmp-1726773061.8951015-9966-207236940100363 , stderr= 9966 1726773061.93037: variable 'ansible_module_compression' from source: unknown 9966 1726773061.93080: ANSIBALLZ: Using lock for slurp 9966 1726773061.93087: ANSIBALLZ: Acquiring lock 9966 1726773061.93091: ANSIBALLZ: Lock acquired: 139792132689360 9966 1726773061.93095: ANSIBALLZ: Creating module 9966 1726773062.03166: ANSIBALLZ: Writing module into payload 9966 1726773062.03222: ANSIBALLZ: Writing module 9966 1726773062.03243: ANSIBALLZ: Renaming module 9966 1726773062.03249: ANSIBALLZ: Done creating module 9966 1726773062.03265: variable 'ansible_facts' from source: unknown 9966 1726773062.03325: transferring 
module to remote /root/.ansible/tmp/ansible-tmp-1726773061.8951015-9966-207236940100363/AnsiballZ_slurp.py 9966 1726773062.03429: Sending initial data 9966 1726773062.03436: Sent initial data (152 bytes) 9966 1726773062.06124: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpldich0bf /root/.ansible/tmp/ansible-tmp-1726773061.8951015-9966-207236940100363/AnsiballZ_slurp.py <<< 9966 1726773062.07303: stderr chunk (state=3): >>><<< 9966 1726773062.07313: stdout chunk (state=3): >>><<< 9966 1726773062.07332: done transferring module to remote 9966 1726773062.07343: _low_level_execute_command(): starting 9966 1726773062.07348: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773061.8951015-9966-207236940100363/ /root/.ansible/tmp/ansible-tmp-1726773061.8951015-9966-207236940100363/AnsiballZ_slurp.py && sleep 0' 9966 1726773062.09838: stderr chunk (state=2): >>><<< 9966 1726773062.09849: stdout chunk (state=2): >>><<< 9966 1726773062.09864: _low_level_execute_command() done: rc=0, stdout=, stderr= 9966 1726773062.09868: _low_level_execute_command(): starting 9966 1726773062.09874: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773061.8951015-9966-207236940100363/AnsiballZ_slurp.py && sleep 0' 9966 1726773062.25078: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdAo=", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 9966 1726773062.26122: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 9966 1726773062.26164: stderr chunk (state=3): >>><<< 9966 1726773062.26172: stdout chunk (state=3): >>><<< 9966 1726773062.26190: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdAo=", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.47.99 closed. 
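The slurp above returns the current tuned profile base64-encoded; "dmlydHVhbC1ndWVzdAo=" decodes to "virtual-guest\n". A minimal sketch of reading and decoding it, assuming a register named __cur_profile (a task var of that name shows up later in this log); the debug task is purely illustrative.

# Hedged sketch of the "Get active_profile" step and how its payload decodes.
- name: Get active_profile
  slurp:
    path: /etc/tuned/active_profile
  register: __cur_profile

- name: Show the decoded profile          # prints "virtual-guest" for this run
  debug:
    msg: "{{ __cur_profile.content | b64decode | trim }}"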
9966 1726773062.26214: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773061.8951015-9966-207236940100363/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9966 1726773062.26225: _low_level_execute_command(): starting 9966 1726773062.26230: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773061.8951015-9966-207236940100363/ > /dev/null 2>&1 && sleep 0' 9966 1726773062.28699: stderr chunk (state=2): >>><<< 9966 1726773062.28710: stdout chunk (state=2): >>><<< 9966 1726773062.28724: _low_level_execute_command() done: rc=0, stdout=, stderr= 9966 1726773062.28732: handler run complete 9966 1726773062.28745: attempt loop complete, returning result 9966 1726773062.28749: _execute() done 9966 1726773062.28752: dumping result to json 9966 1726773062.28756: done dumping result, returning 9966 1726773062.28764: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile [0affffe7-6841-7dd6-8fa6-000000000054] 9966 1726773062.28770: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000054 9966 1726773062.28799: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000054 9966 1726773062.28803: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "content": "dmlydHVhbC1ndWVzdAo=", "encoding": "base64", "source": "/etc/tuned/active_profile" } 9733 1726773062.28933: no more pending results, returning what we have 9733 1726773062.28936: results queue empty 9733 1726773062.28936: checking for any_errors_fatal 9733 1726773062.28943: done checking for any_errors_fatal 9733 1726773062.28944: checking for max_fail_percentage 9733 1726773062.28945: done checking for max_fail_percentage 9733 1726773062.28946: checking to see if all hosts have failed and the running result is not ok 9733 1726773062.28946: done checking to see if all hosts have failed 9733 1726773062.28947: getting the remaining hosts for this loop 9733 1726773062.28948: done getting the remaining hosts for this loop 9733 1726773062.28951: getting the next task for host managed_node3 9733 1726773062.28955: done getting next task for host managed_node3 9733 1726773062.28958: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 9733 1726773062.28960: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773062.28972: getting variables 9733 1726773062.28973: in VariableManager get_vars() 9733 1726773062.29006: Calling all_inventory to load vars for managed_node3 9733 1726773062.29008: Calling groups_inventory to load vars for managed_node3 9733 1726773062.29010: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773062.29020: Calling all_plugins_play to load vars for managed_node3 9733 1726773062.29022: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773062.29024: Calling groups_plugins_play to load vars for managed_node3 9733 1726773062.29165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773062.29281: done with get_vars() 9733 1726773062.29292: done getting variables 9733 1726773062.29333: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set active_profile] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:85 Thursday 19 September 2024 15:11:02 -0400 (0:00:00.445) 0:00:08.026 **** 9733 1726773062.29354: entering _queue_task() for managed_node3/set_fact 9733 1726773062.29526: worker is 1 (out of 1 available) 9733 1726773062.29542: exiting _queue_task() for managed_node3/set_fact 9733 1726773062.29552: done queuing things up, now waiting for results queue to drain 9733 1726773062.29554: waiting for pending results... 9985 1726773062.29668: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 9985 1726773062.29773: in run() - task 0affffe7-6841-7dd6-8fa6-000000000055 9985 1726773062.29793: variable 'ansible_search_path' from source: unknown 9985 1726773062.29797: variable 'ansible_search_path' from source: unknown 9985 1726773062.29825: calling self._execute() 9985 1726773062.29951: variable 'ansible_host' from source: host vars for 'managed_node3' 9985 1726773062.29959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9985 1726773062.29968: variable 'omit' from source: magic vars 9985 1726773062.30042: variable 'omit' from source: magic vars 9985 1726773062.30076: variable 'omit' from source: magic vars 9985 1726773062.30359: variable '__kernel_settings_tuned_profile' from source: role '' all vars 9985 1726773062.30369: variable '__cur_profile' from source: task vars 9985 1726773062.30474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9985 1726773062.32128: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9985 1726773062.32177: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9985 1726773062.32216: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9985 1726773062.32239: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9985 1726773062.32256: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9985 1726773062.32315: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9985 1726773062.32333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9985 1726773062.32348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9985 1726773062.32374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9985 1726773062.32384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9985 1726773062.32457: variable '__kernel_settings_tuned_current_profile' from source: set_fact 9985 1726773062.32497: variable 'omit' from source: magic vars 9985 1726773062.32517: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9985 1726773062.32534: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9985 1726773062.32546: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9985 1726773062.32556: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9985 1726773062.32563: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9985 1726773062.32588: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9985 1726773062.32592: variable 'ansible_host' from source: host vars for 'managed_node3' 9985 1726773062.32595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9985 1726773062.32658: Set connection var ansible_timeout to 10 9985 1726773062.32662: Set connection var ansible_shell_type to sh 9985 1726773062.32665: Set connection var ansible_module_compression to ZIP_DEFLATED 9985 1726773062.32671: Set connection var ansible_shell_executable to /bin/sh 9985 1726773062.32674: Set connection var ansible_pipelining to False 9985 1726773062.32678: Set connection var ansible_connection to ssh 9985 1726773062.32694: variable 'ansible_shell_executable' from source: unknown 9985 1726773062.32697: variable 'ansible_connection' from source: unknown 9985 1726773062.32699: variable 'ansible_module_compression' from source: unknown 9985 1726773062.32701: variable 'ansible_shell_type' from source: unknown 9985 1726773062.32702: variable 'ansible_shell_executable' from source: unknown 9985 1726773062.32704: variable 'ansible_host' from source: host vars for 'managed_node3' 9985 1726773062.32707: variable 'ansible_pipelining' from source: unknown 9985 1726773062.32710: variable 'ansible_timeout' from source: unknown 9985 1726773062.32712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9985 1726773062.32774: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9985 1726773062.32784: variable 'omit' from source: magic vars 9985 1726773062.32791: starting attempt loop 9985 1726773062.32793: running the handler 9985 1726773062.32800: handler run complete 9985 1726773062.32806: attempt loop complete, returning result 9985 1726773062.32808: _execute() done 9985 1726773062.32809: dumping result to json 9985 1726773062.32811: done dumping result, returning 9985 1726773062.32816: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile [0affffe7-6841-7dd6-8fa6-000000000055] 9985 1726773062.32820: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000055 9985 1726773062.32838: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000055 9985 1726773062.32840: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_active_profile": "virtual-guest kernel_settings" }, "changed": false } 9733 1726773062.33141: no more pending results, returning what we have 9733 1726773062.33143: results queue empty 9733 1726773062.33143: checking for any_errors_fatal 9733 1726773062.33146: done checking for any_errors_fatal 9733 1726773062.33146: checking for max_fail_percentage 9733 1726773062.33147: done checking for max_fail_percentage 9733 1726773062.33147: checking to see if all hosts have failed and the running result is not ok 9733 1726773062.33148: done checking to see if all hosts have failed 9733 1726773062.33148: getting the remaining hosts for this loop 9733 1726773062.33149: done getting the remaining hosts for this loop 9733 1726773062.33151: getting the next task for host managed_node3 9733 1726773062.33154: done getting next task for host managed_node3 9733 1726773062.33157: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 9733 1726773062.33158: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773062.33169: getting variables 9733 1726773062.33170: in VariableManager get_vars() 9733 1726773062.33191: Calling all_inventory to load vars for managed_node3 9733 1726773062.33193: Calling groups_inventory to load vars for managed_node3 9733 1726773062.33194: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773062.33200: Calling all_plugins_play to load vars for managed_node3 9733 1726773062.33202: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773062.33203: Calling groups_plugins_play to load vars for managed_node3 9733 1726773062.33298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773062.33411: done with get_vars() 9733 1726773062.33418: done getting variables 9733 1726773062.33503: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91 Thursday 19 September 2024 15:11:02 -0400 (0:00:00.041) 0:00:08.067 **** 9733 1726773062.33523: entering _queue_task() for managed_node3/copy 9733 1726773062.33695: worker is 1 (out of 1 available) 9733 1726773062.33711: exiting _queue_task() for managed_node3/copy 9733 1726773062.33722: done queuing things up, now waiting for results queue to drain 9733 1726773062.33723: waiting for pending results... 9986 1726773062.33842: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 9986 1726773062.33949: in run() - task 0affffe7-6841-7dd6-8fa6-000000000056 9986 1726773062.33964: variable 'ansible_search_path' from source: unknown 9986 1726773062.33968: variable 'ansible_search_path' from source: unknown 9986 1726773062.34001: calling self._execute() 9986 1726773062.34058: variable 'ansible_host' from source: host vars for 'managed_node3' 9986 1726773062.34065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9986 1726773062.34072: variable 'omit' from source: magic vars 9986 1726773062.34141: variable 'omit' from source: magic vars 9986 1726773062.34174: variable 'omit' from source: magic vars 9986 1726773062.34195: variable '__kernel_settings_active_profile' from source: set_fact 9986 1726773062.34409: variable '__kernel_settings_active_profile' from source: set_fact 9986 1726773062.34429: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 9986 1726773062.34479: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 9986 1726773062.34531: variable '__kernel_settings_tuned_dir' from source: role '' all vars 9986 1726773062.34553: variable 'omit' from source: magic vars 9986 1726773062.34584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9986 1726773062.34858: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9986 1726773062.34880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9986 1726773062.34896: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9986 1726773062.34906: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9986 1726773062.34929: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9986 1726773062.34934: variable 'ansible_host' from source: host vars for 'managed_node3' 9986 1726773062.34938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9986 1726773062.35013: Set connection var ansible_timeout to 10 9986 1726773062.35018: Set connection var ansible_shell_type to sh 9986 1726773062.35024: Set connection var ansible_module_compression to ZIP_DEFLATED 9986 1726773062.35030: Set connection var ansible_shell_executable to /bin/sh 9986 1726773062.35035: Set connection var ansible_pipelining to False 9986 1726773062.35041: Set connection var ansible_connection to ssh 9986 1726773062.35058: variable 'ansible_shell_executable' from source: unknown 9986 1726773062.35062: variable 'ansible_connection' from source: unknown 9986 1726773062.35066: variable 'ansible_module_compression' from source: unknown 9986 1726773062.35071: variable 'ansible_shell_type' from source: unknown 9986 1726773062.35074: variable 'ansible_shell_executable' from source: unknown 9986 1726773062.35077: variable 'ansible_host' from source: host vars for 'managed_node3' 9986 1726773062.35079: variable 'ansible_pipelining' from source: unknown 9986 1726773062.35081: variable 'ansible_timeout' from source: unknown 9986 1726773062.35083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9986 1726773062.35170: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9986 1726773062.35180: variable 'omit' from source: magic vars 9986 1726773062.35188: starting attempt loop 9986 1726773062.35190: running the handler 9986 1726773062.35200: _low_level_execute_command(): starting 9986 1726773062.35206: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9986 1726773062.37621: stdout chunk (state=2): >>>/root <<< 9986 1726773062.37739: stderr chunk (state=3): >>><<< 9986 1726773062.37746: stdout chunk (state=3): >>><<< 9986 1726773062.37767: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9986 1726773062.37782: _low_level_execute_command(): starting 9986 1726773062.37791: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008 `" && echo ansible-tmp-1726773062.3777795-9986-113999413897008="` echo /root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008 `" ) && sleep 0' 9986 1726773062.40348: stdout chunk (state=2): >>>ansible-tmp-1726773062.3777795-9986-113999413897008=/root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008 <<< 9986 1726773062.40490: stderr chunk (state=3): >>><<< 9986 1726773062.40497: stdout chunk (state=3): >>><<< 9986 1726773062.40512: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773062.3777795-9986-113999413897008=/root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008 , stderr= 9986 1726773062.40588: 
variable 'ansible_module_compression' from source: unknown 9986 1726773062.40629: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 9986 1726773062.40657: variable 'ansible_facts' from source: unknown 9986 1726773062.40727: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008/AnsiballZ_stat.py 9986 1726773062.40816: Sending initial data 9986 1726773062.40823: Sent initial data (151 bytes) 9986 1726773062.43502: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmp_7b3iqc0 /root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008/AnsiballZ_stat.py <<< 9986 1726773062.44682: stderr chunk (state=3): >>><<< 9986 1726773062.44693: stdout chunk (state=3): >>><<< 9986 1726773062.44712: done transferring module to remote 9986 1726773062.44723: _low_level_execute_command(): starting 9986 1726773062.44728: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008/ /root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008/AnsiballZ_stat.py && sleep 0' 9986 1726773062.47260: stderr chunk (state=2): >>><<< 9986 1726773062.47274: stdout chunk (state=2): >>><<< 9986 1726773062.47294: _low_level_execute_command() done: rc=0, stdout=, stderr= 9986 1726773062.47301: _low_level_execute_command(): starting 9986 1726773062.47306: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008/AnsiballZ_stat.py && sleep 0' 9986 1726773062.63917: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 14, "inode": 79692034, "dev": 51713, "nlink": 1, "atime": 1726773062.2490964, "mtime": 1726773050.0270493, "ctime": 1726773050.2910504, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "mimetype": "text/plain", "charset": "us-ascii", "version": "2036816082", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 9986 1726773062.65087: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. 
<<< 9986 1726773062.65097: stdout chunk (state=3): >>><<< 9986 1726773062.65108: stderr chunk (state=3): >>><<< 9986 1726773062.65120: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 14, "inode": 79692034, "dev": 51713, "nlink": 1, "atime": 1726773062.2490964, "mtime": 1726773050.0270493, "ctime": 1726773050.2910504, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "mimetype": "text/plain", "charset": "us-ascii", "version": "2036816082", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.47.99 closed. 9986 1726773062.65160: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9986 1726773062.65245: Sending initial data 9986 1726773062.65253: Sent initial data (140 bytes) 9986 1726773062.67973: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpyr1zw38n /root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008/source <<< 9986 1726773062.68488: stderr chunk (state=3): >>><<< 9986 1726773062.68498: stdout chunk (state=3): >>><<< 9986 1726773062.68517: _low_level_execute_command(): starting 9986 1726773062.68523: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008/ /root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008/source && sleep 0' 9986 1726773062.70987: stderr chunk (state=2): >>><<< 9986 1726773062.70999: stdout chunk (state=2): >>><<< 9986 1726773062.71017: _low_level_execute_command() done: rc=0, stdout=, stderr= 9986 1726773062.71040: variable 'ansible_module_compression' from source: unknown 9986 1726773062.71094: ANSIBALLZ: Using generic lock for ansible.legacy.copy 9986 1726773062.71100: ANSIBALLZ: Acquiring lock 9986 1726773062.71104: ANSIBALLZ: Lock acquired: 139792132305312 9986 1726773062.71107: ANSIBALLZ: Creating module 9986 1726773062.80929: ANSIBALLZ: Writing module into payload 9986 1726773062.81066: ANSIBALLZ: Writing module 9986 1726773062.81088: ANSIBALLZ: Renaming module 9986 1726773062.81096: ANSIBALLZ: Done creating module 9986 1726773062.81107: variable 'ansible_facts' from 
source: unknown 9986 1726773062.81159: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008/AnsiballZ_copy.py 9986 1726773062.81249: Sending initial data 9986 1726773062.81256: Sent initial data (151 bytes) 9986 1726773062.83913: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmp4r_6vf84 /root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008/AnsiballZ_copy.py <<< 9986 1726773062.85117: stderr chunk (state=3): >>><<< 9986 1726773062.85126: stdout chunk (state=3): >>><<< 9986 1726773062.85146: done transferring module to remote 9986 1726773062.85157: _low_level_execute_command(): starting 9986 1726773062.85163: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008/ /root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008/AnsiballZ_copy.py && sleep 0' 9986 1726773062.87644: stderr chunk (state=2): >>><<< 9986 1726773062.87653: stdout chunk (state=2): >>><<< 9986 1726773062.87667: _low_level_execute_command() done: rc=0, stdout=, stderr= 9986 1726773062.87674: _low_level_execute_command(): starting 9986 1726773062.87680: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008/AnsiballZ_copy.py && sleep 0' 9986 1726773063.04253: stdout chunk (state=2): >>> {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008/source", "md5sum": "8d80fe3f09ba4b9ac8d7fd5e8541a324", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008/source", "_original_basename": "tmpyr1zw38n", "follow": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 9986 1726773063.05572: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. 
<<< 9986 1726773063.05584: stdout chunk (state=3): >>><<< 9986 1726773063.05599: stderr chunk (state=3): >>><<< 9986 1726773063.05612: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008/source", "md5sum": "8d80fe3f09ba4b9ac8d7fd5e8541a324", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008/source", "_original_basename": "tmpyr1zw38n", "follow": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.47.99 closed. 9986 1726773063.05647: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/active_profile', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008/source', '_original_basename': 'tmpyr1zw38n', 'follow': False, 'checksum': 'a79569d3860cb6a066e0e92c8b22ffd0e8796bfd', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9986 1726773063.05659: _low_level_execute_command(): starting 9986 1726773063.05665: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008/ > /dev/null 2>&1 && sleep 0' 9986 1726773063.08256: stderr chunk (state=2): >>><<< 9986 1726773063.08266: stdout chunk (state=2): >>><<< 9986 1726773063.08284: _low_level_execute_command() done: rc=0, stdout=, stderr= 9986 1726773063.08295: handler run complete 9986 1726773063.08314: attempt loop complete, returning result 9986 1726773063.08318: _execute() done 9986 1726773063.08321: dumping result to json 9986 1726773063.08327: done dumping result, returning 9986 1726773063.08334: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile [0affffe7-6841-7dd6-8fa6-000000000056] 9986 1726773063.08339: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000056 9986 1726773063.08371: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000056 9986 1726773063.08375: WORKER PROCESS EXITING changed: [managed_node3] => { "changed": true, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "md5sum": "8d80fe3f09ba4b9ac8d7fd5e8541a324", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "src": 
"/root/.ansible/tmp/ansible-tmp-1726773062.3777795-9986-113999413897008/source", "state": "file", "uid": 0 } 9733 1726773063.08528: no more pending results, returning what we have 9733 1726773063.08531: results queue empty 9733 1726773063.08531: checking for any_errors_fatal 9733 1726773063.08536: done checking for any_errors_fatal 9733 1726773063.08537: checking for max_fail_percentage 9733 1726773063.08538: done checking for max_fail_percentage 9733 1726773063.08538: checking to see if all hosts have failed and the running result is not ok 9733 1726773063.08539: done checking to see if all hosts have failed 9733 1726773063.08539: getting the remaining hosts for this loop 9733 1726773063.08540: done getting the remaining hosts for this loop 9733 1726773063.08544: getting the next task for host managed_node3 9733 1726773063.08549: done getting next task for host managed_node3 9733 1726773063.08554: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 9733 1726773063.08556: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773063.08564: getting variables 9733 1726773063.08566: in VariableManager get_vars() 9733 1726773063.08600: Calling all_inventory to load vars for managed_node3 9733 1726773063.08603: Calling groups_inventory to load vars for managed_node3 9733 1726773063.08605: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773063.08614: Calling all_plugins_play to load vars for managed_node3 9733 1726773063.08616: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773063.08619: Calling groups_plugins_play to load vars for managed_node3 9733 1726773063.08931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773063.09046: done with get_vars() 9733 1726773063.09053: done getting variables 9733 1726773063.09096: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set profile_mode to manual] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99 Thursday 19 September 2024 15:11:03 -0400 (0:00:00.755) 0:00:08.823 **** 9733 1726773063.09117: entering _queue_task() for managed_node3/copy 9733 1726773063.09296: worker is 1 (out of 1 available) 9733 1726773063.09311: exiting _queue_task() for managed_node3/copy 9733 1726773063.09322: done queuing things up, now waiting for results queue to drain 9733 1726773063.09324: waiting for pending results... 
10025 1726773063.09441: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 10025 1726773063.09554: in run() - task 0affffe7-6841-7dd6-8fa6-000000000057 10025 1726773063.09571: variable 'ansible_search_path' from source: unknown 10025 1726773063.09576: variable 'ansible_search_path' from source: unknown 10025 1726773063.09607: calling self._execute() 10025 1726773063.09668: variable 'ansible_host' from source: host vars for 'managed_node3' 10025 1726773063.09677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10025 1726773063.09688: variable 'omit' from source: magic vars 10025 1726773063.09759: variable 'omit' from source: magic vars 10025 1726773063.09796: variable 'omit' from source: magic vars 10025 1726773063.09818: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 10025 1726773063.10042: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 10025 1726773063.10103: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10025 1726773063.10129: variable 'omit' from source: magic vars 10025 1726773063.10159: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10025 1726773063.10186: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10025 1726773063.10204: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10025 1726773063.10217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10025 1726773063.10226: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10025 1726773063.10247: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10025 1726773063.10251: variable 'ansible_host' from source: host vars for 'managed_node3' 10025 1726773063.10253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10025 1726773063.10343: Set connection var ansible_timeout to 10 10025 1726773063.10348: Set connection var ansible_shell_type to sh 10025 1726773063.10354: Set connection var ansible_module_compression to ZIP_DEFLATED 10025 1726773063.10360: Set connection var ansible_shell_executable to /bin/sh 10025 1726773063.10365: Set connection var ansible_pipelining to False 10025 1726773063.10372: Set connection var ansible_connection to ssh 10025 1726773063.10391: variable 'ansible_shell_executable' from source: unknown 10025 1726773063.10396: variable 'ansible_connection' from source: unknown 10025 1726773063.10399: variable 'ansible_module_compression' from source: unknown 10025 1726773063.10400: variable 'ansible_shell_type' from source: unknown 10025 1726773063.10402: variable 'ansible_shell_executable' from source: unknown 10025 1726773063.10404: variable 'ansible_host' from source: host vars for 'managed_node3' 10025 1726773063.10406: variable 'ansible_pipelining' from source: unknown 10025 1726773063.10407: variable 'ansible_timeout' from source: unknown 10025 1726773063.10409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10025 1726773063.10511: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10025 1726773063.10522: variable 'omit' from source: magic vars 10025 1726773063.10529: starting attempt loop 10025 1726773063.10532: running the handler 10025 1726773063.10543: _low_level_execute_command(): starting 10025 1726773063.10551: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10025 1726773063.12954: stdout chunk (state=2): >>>/root <<< 10025 1726773063.13082: stderr chunk (state=3): >>><<< 10025 1726773063.13091: stdout chunk (state=3): >>><<< 10025 1726773063.13111: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10025 1726773063.13123: _low_level_execute_command(): starting 10025 1726773063.13129: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519 `" && echo ansible-tmp-1726773063.1311862-10025-183978371147519="` echo /root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519 `" ) && sleep 0' 10025 1726773063.15675: stdout chunk (state=2): >>>ansible-tmp-1726773063.1311862-10025-183978371147519=/root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519 <<< 10025 1726773063.15810: stderr chunk (state=3): >>><<< 10025 1726773063.15819: stdout chunk (state=3): >>><<< 10025 1726773063.15837: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773063.1311862-10025-183978371147519=/root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519 , stderr= 10025 1726773063.15910: variable 'ansible_module_compression' from source: unknown 10025 1726773063.15958: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10025 1726773063.15993: variable 'ansible_facts' from source: unknown 10025 1726773063.16064: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519/AnsiballZ_stat.py 10025 1726773063.16155: Sending initial data 10025 1726773063.16163: Sent initial data (152 bytes) 10025 1726773063.18788: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmp172x0ehg /root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519/AnsiballZ_stat.py <<< 10025 1726773063.19972: stderr chunk (state=3): >>><<< 10025 1726773063.19981: stdout chunk (state=3): >>><<< 10025 1726773063.20002: done transferring module to remote 10025 1726773063.20013: _low_level_execute_command(): starting 10025 1726773063.20019: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519/ /root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519/AnsiballZ_stat.py && sleep 0' 10025 1726773063.22591: stderr chunk (state=2): >>><<< 10025 1726773063.22600: stdout chunk (state=2): >>><<< 10025 1726773063.22616: _low_level_execute_command() done: rc=0, stdout=, stderr= 10025 1726773063.22621: _low_level_execute_command(): starting 10025 1726773063.22627: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519/AnsiballZ_stat.py && sleep 0' 10025 1726773063.39226: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", 
"isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 5, "inode": 85983490, "dev": 51713, "nlink": 1, "atime": 1726773050.952053, "mtime": 1726773050.692052, "ctime": 1726773050.953053, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "mimetype": "text/plain", "charset": "us-ascii", "version": "2189578446", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 10025 1726773063.40414: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10025 1726773063.40457: stderr chunk (state=3): >>><<< 10025 1726773063.40466: stdout chunk (state=3): >>><<< 10025 1726773063.40485: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 5, "inode": 85983490, "dev": 51713, "nlink": 1, "atime": 1726773050.952053, "mtime": 1726773050.692052, "ctime": 1726773050.953053, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "mimetype": "text/plain", "charset": "us-ascii", "version": "2189578446", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.47.99 closed. 
10025 1726773063.40529: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10025 1726773063.40616: Sending initial data 10025 1726773063.40625: Sent initial data (141 bytes) 10025 1726773063.43275: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmptyy8zh9b /root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519/source <<< 10025 1726773063.43706: stderr chunk (state=3): >>><<< 10025 1726773063.43714: stdout chunk (state=3): >>><<< 10025 1726773063.43734: _low_level_execute_command(): starting 10025 1726773063.43741: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519/ /root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519/source && sleep 0' 10025 1726773063.46148: stderr chunk (state=2): >>><<< 10025 1726773063.46156: stdout chunk (state=2): >>><<< 10025 1726773063.46174: _low_level_execute_command() done: rc=0, stdout=, stderr= 10025 1726773063.46196: variable 'ansible_module_compression' from source: unknown 10025 1726773063.46232: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 10025 1726773063.46255: variable 'ansible_facts' from source: unknown 10025 1726773063.46315: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519/AnsiballZ_copy.py 10025 1726773063.46410: Sending initial data 10025 1726773063.46417: Sent initial data (152 bytes) 10025 1726773063.49001: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmp8fm04558 /root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519/AnsiballZ_copy.py <<< 10025 1726773063.50220: stderr chunk (state=3): >>><<< 10025 1726773063.50230: stdout chunk (state=3): >>><<< 10025 1726773063.50248: done transferring module to remote 10025 1726773063.50256: _low_level_execute_command(): starting 10025 1726773063.50260: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519/ /root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519/AnsiballZ_copy.py && sleep 0' 10025 1726773063.52723: stderr chunk (state=2): >>><<< 10025 1726773063.52733: stdout chunk (state=2): >>><<< 10025 1726773063.52748: _low_level_execute_command() done: rc=0, stdout=, stderr= 10025 1726773063.52752: _low_level_execute_command(): starting 10025 1726773063.52758: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519/AnsiballZ_copy.py && sleep 0' 10025 1726773063.69232: stdout chunk (state=2): >>> {"dest": "/etc/tuned/profile_mode", "src": 
"/root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519/source", "md5sum": "cf3f2a865fbea819dadd439586eaee31", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519/source", "_original_basename": "tmptyy8zh9b", "follow": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10025 1726773063.70415: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10025 1726773063.70462: stderr chunk (state=3): >>><<< 10025 1726773063.70472: stdout chunk (state=3): >>><<< 10025 1726773063.70490: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519/source", "md5sum": "cf3f2a865fbea819dadd439586eaee31", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519/source", "_original_basename": "tmptyy8zh9b", "follow": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.47.99 closed. 
10025 1726773063.70517: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/profile_mode', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519/source', '_original_basename': 'tmptyy8zh9b', 'follow': False, 'checksum': '3ef9f23deed2e23d3ef2b88b842fb882313e15ce', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10025 1726773063.70528: _low_level_execute_command(): starting 10025 1726773063.70535: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519/ > /dev/null 2>&1 && sleep 0' 10025 1726773063.73021: stderr chunk (state=2): >>><<< 10025 1726773063.73030: stdout chunk (state=2): >>><<< 10025 1726773063.73049: _low_level_execute_command() done: rc=0, stdout=, stderr= 10025 1726773063.73063: handler run complete 10025 1726773063.73084: attempt loop complete, returning result 10025 1726773063.73091: _execute() done 10025 1726773063.73095: dumping result to json 10025 1726773063.73102: done dumping result, returning 10025 1726773063.73110: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual [0affffe7-6841-7dd6-8fa6-000000000057] 10025 1726773063.73115: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000057 10025 1726773063.73148: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000057 10025 1726773063.73152: WORKER PROCESS EXITING changed: [managed_node3] => { "changed": true, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "md5sum": "cf3f2a865fbea819dadd439586eaee31", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "src": "/root/.ansible/tmp/ansible-tmp-1726773063.1311862-10025-183978371147519/source", "state": "file", "uid": 0 } 9733 1726773063.73367: no more pending results, returning what we have 9733 1726773063.73370: results queue empty 9733 1726773063.73371: checking for any_errors_fatal 9733 1726773063.73376: done checking for any_errors_fatal 9733 1726773063.73376: checking for max_fail_percentage 9733 1726773063.73377: done checking for max_fail_percentage 9733 1726773063.73378: checking to see if all hosts have failed and the running result is not ok 9733 1726773063.73378: done checking to see if all hosts have failed 9733 1726773063.73379: getting the remaining hosts for this loop 9733 1726773063.73380: done getting the remaining hosts for this loop 9733 1726773063.73383: getting the next task for host managed_node3 9733 1726773063.73390: done getting next task for host managed_node3 9733 1726773063.73393: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 9733 1726773063.73394: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773063.73403: getting variables 9733 1726773063.73404: in VariableManager get_vars() 9733 1726773063.73428: Calling all_inventory to load vars for managed_node3 9733 1726773063.73429: Calling groups_inventory to load vars for managed_node3 9733 1726773063.73431: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773063.73437: Calling all_plugins_play to load vars for managed_node3 9733 1726773063.73439: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773063.73440: Calling groups_plugins_play to load vars for managed_node3 9733 1726773063.73545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773063.73668: done with get_vars() 9733 1726773063.73677: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get current config] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107 Thursday 19 September 2024 15:11:03 -0400 (0:00:00.646) 0:00:09.469 **** 9733 1726773063.73736: entering _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config 9733 1726773063.73900: worker is 1 (out of 1 available) 9733 1726773063.73913: exiting _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config 9733 1726773063.73924: done queuing things up, now waiting for results queue to drain 9733 1726773063.73926: waiting for pending results... 
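The "Set profile_mode to manual" task above follows the same pattern: the action plugin first stats /etc/tuned/profile_mode (it already exists, 5 bytes), then uploads a new 7-byte source and runs ansible.legacy.copy, which reports changed=true with mode 0600. A sketch of the likely task follows, assuming the literal "manual" content from the task name and the 7-byte size; content and the directory composition are assumptions, dest and mode are taken from the logged module_args.

- name: Set profile_mode to manual
  ansible.builtin.copy:
    content: "manual\n"                                      # assumption: 7 bytes, matching the reported size
    dest: "{{ __kernel_settings_tuned_dir }}/profile_mode"   # resolves to /etc/tuned/profile_mode in this run
    mode: "0600"
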
10055 1726773063.74051: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get current config 10055 1726773063.74154: in run() - task 0affffe7-6841-7dd6-8fa6-000000000058 10055 1726773063.74173: variable 'ansible_search_path' from source: unknown 10055 1726773063.74177: variable 'ansible_search_path' from source: unknown 10055 1726773063.74207: calling self._execute() 10055 1726773063.74271: variable 'ansible_host' from source: host vars for 'managed_node3' 10055 1726773063.74280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10055 1726773063.74290: variable 'omit' from source: magic vars 10055 1726773063.74362: variable 'omit' from source: magic vars 10055 1726773063.74402: variable 'omit' from source: magic vars 10055 1726773063.74423: variable '__kernel_settings_profile_filename' from source: role '' all vars 10055 1726773063.74645: variable '__kernel_settings_profile_filename' from source: role '' all vars 10055 1726773063.74707: variable '__kernel_settings_profile_dir' from source: role '' all vars 10055 1726773063.74771: variable '__kernel_settings_profile_parent' from source: set_fact 10055 1726773063.74780: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10055 1726773063.74872: variable 'omit' from source: magic vars 10055 1726773063.74907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10055 1726773063.74935: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10055 1726773063.74958: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10055 1726773063.74982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10055 1726773063.75002: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10055 1726773063.75029: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10055 1726773063.75035: variable 'ansible_host' from source: host vars for 'managed_node3' 10055 1726773063.75039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10055 1726773063.75139: Set connection var ansible_timeout to 10 10055 1726773063.75145: Set connection var ansible_shell_type to sh 10055 1726773063.75151: Set connection var ansible_module_compression to ZIP_DEFLATED 10055 1726773063.75156: Set connection var ansible_shell_executable to /bin/sh 10055 1726773063.75162: Set connection var ansible_pipelining to False 10055 1726773063.75171: Set connection var ansible_connection to ssh 10055 1726773063.75193: variable 'ansible_shell_executable' from source: unknown 10055 1726773063.75198: variable 'ansible_connection' from source: unknown 10055 1726773063.75202: variable 'ansible_module_compression' from source: unknown 10055 1726773063.75205: variable 'ansible_shell_type' from source: unknown 10055 1726773063.75207: variable 'ansible_shell_executable' from source: unknown 10055 1726773063.75210: variable 'ansible_host' from source: host vars for 'managed_node3' 10055 1726773063.75214: variable 'ansible_pipelining' from source: unknown 10055 1726773063.75217: variable 'ansible_timeout' from source: unknown 10055 1726773063.75221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10055 1726773063.75392: Loading ActionModule 'normal' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10055 1726773063.75404: variable 'omit' from source: magic vars 10055 1726773063.75410: starting attempt loop 10055 1726773063.75413: running the handler 10055 1726773063.75425: _low_level_execute_command(): starting 10055 1726773063.75433: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10055 1726773063.77897: stdout chunk (state=2): >>>/root <<< 10055 1726773063.78016: stderr chunk (state=3): >>><<< 10055 1726773063.78023: stdout chunk (state=3): >>><<< 10055 1726773063.78041: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10055 1726773063.78054: _low_level_execute_command(): starting 10055 1726773063.78060: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773063.780491-10055-86877754735149 `" && echo ansible-tmp-1726773063.780491-10055-86877754735149="` echo /root/.ansible/tmp/ansible-tmp-1726773063.780491-10055-86877754735149 `" ) && sleep 0' 10055 1726773063.80574: stdout chunk (state=2): >>>ansible-tmp-1726773063.780491-10055-86877754735149=/root/.ansible/tmp/ansible-tmp-1726773063.780491-10055-86877754735149 <<< 10055 1726773063.80703: stderr chunk (state=3): >>><<< 10055 1726773063.80716: stdout chunk (state=3): >>><<< 10055 1726773063.80728: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773063.780491-10055-86877754735149=/root/.ansible/tmp/ansible-tmp-1726773063.780491-10055-86877754735149 , stderr= 10055 1726773063.80770: variable 'ansible_module_compression' from source: unknown 10055 1726773063.80804: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 10055 1726773063.80836: variable 'ansible_facts' from source: unknown 10055 1726773063.80903: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773063.780491-10055-86877754735149/AnsiballZ_kernel_settings_get_config.py 10055 1726773063.81002: Sending initial data 10055 1726773063.81010: Sent initial data (172 bytes) 10055 1726773063.83584: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpunhpifs8 /root/.ansible/tmp/ansible-tmp-1726773063.780491-10055-86877754735149/AnsiballZ_kernel_settings_get_config.py <<< 10055 1726773063.84747: stderr chunk (state=3): >>><<< 10055 1726773063.84754: stdout chunk (state=3): >>><<< 10055 1726773063.84775: done transferring module to remote 10055 1726773063.84787: _low_level_execute_command(): starting 10055 1726773063.84793: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773063.780491-10055-86877754735149/ /root/.ansible/tmp/ansible-tmp-1726773063.780491-10055-86877754735149/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10055 1726773063.87205: stderr chunk (state=2): >>><<< 10055 1726773063.87217: stdout chunk (state=2): >>><<< 10055 1726773063.87232: _low_level_execute_command() done: rc=0, stdout=, stderr= 10055 1726773063.87236: _low_level_execute_command(): starting 10055 1726773063.87241: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python 
/root/.ansible/tmp/ansible-tmp-1726773063.780491-10055-86877754735149/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10055 1726773064.02873: stdout chunk (state=2): >>> {"changed": false, "data": {}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} <<< 10055 1726773064.03942: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10055 1726773064.03955: stdout chunk (state=3): >>><<< 10055 1726773064.03965: stderr chunk (state=3): >>><<< 10055 1726773064.03977: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} , stderr=Shared connection to 10.31.47.99 closed. 10055 1726773064.03998: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773063.780491-10055-86877754735149/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10055 1726773064.04010: _low_level_execute_command(): starting 10055 1726773064.04016: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773063.780491-10055-86877754735149/ > /dev/null 2>&1 && sleep 0' 10055 1726773064.06455: stderr chunk (state=2): >>><<< 10055 1726773064.06464: stdout chunk (state=2): >>><<< 10055 1726773064.06479: _low_level_execute_command() done: rc=0, stdout=, stderr= 10055 1726773064.06487: handler run complete 10055 1726773064.06502: attempt loop complete, returning result 10055 1726773064.06506: _execute() done 10055 1726773064.06510: dumping result to json 10055 1726773064.06514: done dumping result, returning 10055 1726773064.06521: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get current config [0affffe7-6841-7dd6-8fa6-000000000058] 10055 1726773064.06532: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000058 10055 1726773064.06563: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000058 10055 1726773064.06566: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "data": {} } 9733 1726773064.06694: no more pending results, returning what we have 9733 1726773064.06697: results queue empty 9733 1726773064.06698: checking for any_errors_fatal 9733 1726773064.06703: done checking for any_errors_fatal 9733 1726773064.06704: checking for max_fail_percentage 9733 1726773064.06705: done checking for max_fail_percentage 9733 1726773064.06706: checking to see if all hosts have failed and the running result is not ok 9733 1726773064.06706: done checking to see if all hosts have failed 9733 1726773064.06707: getting the remaining hosts for this loop 9733 1726773064.06708: done getting the remaining hosts for this loop 9733 1726773064.06711: getting the next task for host managed_node3 9733 1726773064.06715: done getting next task for host managed_node3 9733 1726773064.06718: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : 
Apply kernel settings 9733 1726773064.06720: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773064.06729: getting variables 9733 1726773064.06730: in VariableManager get_vars() 9733 1726773064.06761: Calling all_inventory to load vars for managed_node3 9733 1726773064.06764: Calling groups_inventory to load vars for managed_node3 9733 1726773064.06765: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773064.06776: Calling all_plugins_play to load vars for managed_node3 9733 1726773064.06779: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773064.06781: Calling groups_plugins_play to load vars for managed_node3 9733 1726773064.06944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773064.07059: done with get_vars() 9733 1726773064.07067: done getting variables 9733 1726773064.07151: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Apply kernel settings] ******* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 Thursday 19 September 2024 15:11:04 -0400 (0:00:00.334) 0:00:09.804 **** 9733 1726773064.07174: entering _queue_task() for managed_node3/template 9733 1726773064.07175: Creating lock for template 9733 1726773064.07344: worker is 1 (out of 1 available) 9733 1726773064.07359: exiting _queue_task() for managed_node3/template 9733 1726773064.07374: done queuing things up, now waiting for results queue to drain 9733 1726773064.07376: waiting for pending results... 
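The "Get current config" step that just finished uses the collection's own module, fedora.linux_system_roles.kernel_settings_get_config, with a single path argument, and on this run it returns an empty data dict (the destination file does not exist yet, as the stat in the next task confirms). A minimal sketch of how that call is made from a task, assuming the path is composed from the profile dir and filename variables evaluated above and that the result feeds the later __kernel_settings_profile_contents fact; the exact register/set_fact wiring is an assumption.

- name: Get current config
  fedora.linux_system_roles.kernel_settings_get_config:
    path: "{{ __kernel_settings_profile_dir }}/{{ __kernel_settings_profile_filename }}"   # /etc/tuned/kernel_settings/tuned.conf here
  register: __kernel_settings_profile_contents   # assumption: name borrowed from the set_fact variable seen later in this log
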
10074 1726773064.07491: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 10074 1726773064.07597: in run() - task 0affffe7-6841-7dd6-8fa6-000000000059 10074 1726773064.07614: variable 'ansible_search_path' from source: unknown 10074 1726773064.07618: variable 'ansible_search_path' from source: unknown 10074 1726773064.07648: calling self._execute() 10074 1726773064.07715: variable 'ansible_host' from source: host vars for 'managed_node3' 10074 1726773064.07722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10074 1726773064.07728: variable 'omit' from source: magic vars 10074 1726773064.07799: variable 'omit' from source: magic vars 10074 1726773064.07836: variable 'omit' from source: magic vars 10074 1726773064.08074: variable '__kernel_settings_profile_src' from source: role '' all vars 10074 1726773064.08084: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10074 1726773064.08140: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10074 1726773064.08161: variable '__kernel_settings_profile_filename' from source: role '' all vars 10074 1726773064.08208: variable '__kernel_settings_profile_filename' from source: role '' all vars 10074 1726773064.08258: variable '__kernel_settings_profile_dir' from source: role '' all vars 10074 1726773064.08318: variable '__kernel_settings_profile_parent' from source: set_fact 10074 1726773064.08327: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10074 1726773064.08351: variable 'omit' from source: magic vars 10074 1726773064.08386: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10074 1726773064.08413: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10074 1726773064.08430: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10074 1726773064.08444: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10074 1726773064.08455: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10074 1726773064.08482: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10074 1726773064.08489: variable 'ansible_host' from source: host vars for 'managed_node3' 10074 1726773064.08493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10074 1726773064.08558: Set connection var ansible_timeout to 10 10074 1726773064.08563: Set connection var ansible_shell_type to sh 10074 1726773064.08569: Set connection var ansible_module_compression to ZIP_DEFLATED 10074 1726773064.08574: Set connection var ansible_shell_executable to /bin/sh 10074 1726773064.08580: Set connection var ansible_pipelining to False 10074 1726773064.08588: Set connection var ansible_connection to ssh 10074 1726773064.08603: variable 'ansible_shell_executable' from source: unknown 10074 1726773064.08605: variable 'ansible_connection' from source: unknown 10074 1726773064.08607: variable 'ansible_module_compression' from source: unknown 10074 1726773064.08609: variable 'ansible_shell_type' from source: unknown 10074 1726773064.08610: variable 'ansible_shell_executable' from source: unknown 10074 1726773064.08612: variable 'ansible_host' from source: host vars for 'managed_node3' 10074 1726773064.08614: 
variable 'ansible_pipelining' from source: unknown 10074 1726773064.08616: variable 'ansible_timeout' from source: unknown 10074 1726773064.08618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10074 1726773064.08707: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10074 1726773064.08717: variable 'omit' from source: magic vars 10074 1726773064.08721: starting attempt loop 10074 1726773064.08723: running the handler 10074 1726773064.08732: _low_level_execute_command(): starting 10074 1726773064.08738: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10074 1726773064.11110: stdout chunk (state=2): >>>/root <<< 10074 1726773064.11234: stderr chunk (state=3): >>><<< 10074 1726773064.11243: stdout chunk (state=3): >>><<< 10074 1726773064.11262: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10074 1726773064.11275: _low_level_execute_command(): starting 10074 1726773064.11281: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033 `" && echo ansible-tmp-1726773064.112697-10074-205341948065033="` echo /root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033 `" ) && sleep 0' 10074 1726773064.13812: stdout chunk (state=2): >>>ansible-tmp-1726773064.112697-10074-205341948065033=/root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033 <<< 10074 1726773064.13937: stderr chunk (state=3): >>><<< 10074 1726773064.13944: stdout chunk (state=3): >>><<< 10074 1726773064.13960: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773064.112697-10074-205341948065033=/root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033 , stderr= 10074 1726773064.13977: evaluation_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks 10074 1726773064.13998: search_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2 10074 1726773064.14018: variable 'ansible_search_path' from source: unknown 10074 1726773064.14604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10074 1726773064.16043: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10074 1726773064.16101: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10074 1726773064.16130: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10074 1726773064.16157: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10074 1726773064.16179: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10074 1726773064.16369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10074 1726773064.16396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10074 1726773064.16418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10074 1726773064.16446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10074 1726773064.16459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10074 1726773064.16689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10074 1726773064.16707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10074 1726773064.16724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10074 1726773064.16751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10074 1726773064.16762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10074 1726773064.17011: variable 'ansible_managed' from source: unknown 10074 1726773064.17018: variable '__sections' from source: task vars 10074 1726773064.17106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10074 1726773064.17126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10074 1726773064.17145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10074 1726773064.17171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10074 1726773064.17182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10074 1726773064.17255: variable 'kernel_settings_sysctl' from source: include_vars 10074 1726773064.17266: variable '__kernel_settings_state_empty' from source: role '' all vars 10074 1726773064.17272: variable '__kernel_settings_previous_replaced' from source: role '' all vars 10074 1726773064.17319: variable '__sysctl_old' from source: task vars 10074 1726773064.17365: variable '__sysctl_old' from source: task vars 10074 1726773064.17505: variable 'kernel_settings_purge' from source: role '' defaults 10074 1726773064.17511: variable 'kernel_settings_sysctl' from source: include_vars 10074 1726773064.17520: variable '__kernel_settings_state_empty' from source: role '' all vars 10074 1726773064.17524: variable '__kernel_settings_previous_replaced' from source: role '' all vars 10074 1726773064.17529: variable '__kernel_settings_profile_contents' from source: set_fact 10074 1726773064.17654: variable 'kernel_settings_sysfs' from source: include_vars 10074 1726773064.17664: variable '__kernel_settings_state_empty' from source: role '' all vars 10074 1726773064.17668: variable '__kernel_settings_previous_replaced' from source: role '' all vars 10074 1726773064.17698: variable '__sysfs_old' from source: task vars 10074 1726773064.17739: variable '__sysfs_old' from source: task vars 10074 1726773064.17874: variable 'kernel_settings_purge' from source: role '' defaults 10074 1726773064.17881: variable 'kernel_settings_sysfs' from source: include_vars 10074 1726773064.17891: variable '__kernel_settings_state_empty' from source: role '' all vars 10074 1726773064.17896: variable '__kernel_settings_previous_replaced' from source: role '' all vars 10074 1726773064.17901: variable '__kernel_settings_profile_contents' from source: set_fact 10074 1726773064.17916: variable 'kernel_settings_systemd_cpu_affinity' from source: include_vars 10074 1726773064.17958: variable 'kernel_settings_systemd_cpu_affinity' from source: include_vars 10074 1726773064.17974: variable '__systemd_old' from source: task vars 10074 1726773064.18017: variable '__systemd_old' from source: task vars 10074 1726773064.18145: variable 'kernel_settings_purge' from source: role '' defaults 10074 1726773064.18150: variable 'kernel_settings_systemd_cpu_affinity' from source: include_vars 10074 1726773064.18154: variable '__kernel_settings_state_absent' from source: role '' all vars 10074 1726773064.18157: variable '__kernel_settings_profile_contents' from source: set_fact 10074 1726773064.18164: variable 'kernel_settings_transparent_hugepages' from source: include_vars 10074 1726773064.18205: variable 'kernel_settings_transparent_hugepages' from source: include_vars 10074 1726773064.18214: variable 'kernel_settings_transparent_hugepages_defrag' from source: include_vars 10074 1726773064.18252: variable 'kernel_settings_transparent_hugepages_defrag' from source: include_vars 10074 1726773064.18264: variable '__trans_huge_old' from source: task vars 10074 1726773064.18313: variable '__trans_huge_old' from source: task vars 10074 1726773064.18462: variable 'kernel_settings_purge' from source: role '' defaults 10074 1726773064.18472: variable 'kernel_settings_transparent_hugepages' from source: include_vars 
10074 1726773064.18477: variable '__kernel_settings_state_absent' from source: role '' all vars 10074 1726773064.18483: variable '__kernel_settings_profile_contents' from source: set_fact 10074 1726773064.18494: variable '__trans_defrag_old' from source: task vars 10074 1726773064.18536: variable '__trans_defrag_old' from source: task vars 10074 1726773064.18665: variable 'kernel_settings_purge' from source: role '' defaults 10074 1726773064.18674: variable 'kernel_settings_transparent_hugepages_defrag' from source: include_vars 10074 1726773064.18681: variable '__kernel_settings_state_absent' from source: role '' all vars 10074 1726773064.18687: variable '__kernel_settings_profile_contents' from source: set_fact 10074 1726773064.18701: variable '__kernel_settings_state_absent' from source: role '' all vars 10074 1726773064.18712: variable '__kernel_settings_state_absent' from source: role '' all vars 10074 1726773064.18719: variable '__kernel_settings_state_absent' from source: role '' all vars 10074 1726773064.18726: variable '__kernel_settings_state_absent' from source: role '' all vars 10074 1726773064.18733: variable '__kernel_settings_state_absent' from source: role '' all vars 10074 1726773064.18740: variable '__kernel_settings_state_absent' from source: role '' all vars 10074 1726773064.18748: variable '__kernel_settings_state_absent' from source: role '' all vars 10074 1726773064.18755: variable '__kernel_settings_state_absent' from source: role '' all vars 10074 1726773064.18761: variable '__kernel_settings_state_absent' from source: role '' all vars 10074 1726773064.18765: variable '__kernel_settings_state_absent' from source: role '' all vars 10074 1726773064.18778: variable '__kernel_settings_state_absent' from source: role '' all vars 10074 1726773064.18787: variable '__kernel_settings_state_absent' from source: role '' all vars 10074 1726773064.18792: variable '__kernel_settings_state_absent' from source: role '' all vars 10074 1726773064.18797: variable '__kernel_settings_state_absent' from source: role '' all vars 10074 1726773064.18801: variable '__kernel_settings_state_absent' from source: role '' all vars 10074 1726773064.18805: variable '__kernel_settings_state_absent' from source: role '' all vars 10074 1726773064.18809: variable '__kernel_settings_state_absent' from source: role '' all vars 10074 1726773064.18813: variable '__kernel_settings_state_absent' from source: role '' all vars 10074 1726773064.18816: variable '__kernel_settings_state_absent' from source: role '' all vars 10074 1726773064.18819: variable '__kernel_settings_state_absent' from source: role '' all vars 10074 1726773064.19259: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10074 1726773064.19309: variable 'ansible_module_compression' from source: unknown 10074 1726773064.19348: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10074 1726773064.19377: variable 'ansible_facts' from source: unknown 10074 1726773064.19445: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033/AnsiballZ_stat.py 10074 1726773064.19539: Sending initial data 10074 1726773064.19546: Sent initial data (151 bytes) 10074 1726773064.22250: stdout chunk 
(state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpk4yp1dl2 /root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033/AnsiballZ_stat.py <<< 10074 1726773064.23426: stderr chunk (state=3): >>><<< 10074 1726773064.23443: stdout chunk (state=3): >>><<< 10074 1726773064.23472: done transferring module to remote 10074 1726773064.23484: _low_level_execute_command(): starting 10074 1726773064.23492: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033/ /root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033/AnsiballZ_stat.py && sleep 0' 10074 1726773064.26175: stderr chunk (state=2): >>><<< 10074 1726773064.26187: stdout chunk (state=2): >>><<< 10074 1726773064.26204: _low_level_execute_command() done: rc=0, stdout=, stderr= 10074 1726773064.26208: _low_level_execute_command(): starting 10074 1726773064.26213: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033/AnsiballZ_stat.py && sleep 0' 10074 1726773064.41587: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 10074 1726773064.42623: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10074 1726773064.42667: stderr chunk (state=3): >>><<< 10074 1726773064.42677: stdout chunk (state=3): >>><<< 10074 1726773064.42694: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.47.99 closed. 
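The long run of variable lookups above (kernel_settings_sysctl, kernel_settings_sysfs, kernel_settings_systemd_cpu_affinity, kernel_settings_transparent_hugepages and its _defrag counterpart, all from include_vars) is the template plugin rendering kernel_settings.j2 locally; the stat just completed shows the destination /etc/tuned/kernel_settings/tuned.conf does not exist yet, so a full upload follows. The test's include_vars file is not shown in this log, but the role's settings variables are conventionally lists of name/value pairs plus a few scalars, so it plausibly resembles the sketch below; every value here is a placeholder, only the variable names are taken from the log.

kernel_settings_sysctl:
  - name: fs.file-max                             # placeholder entry
    value: 400000
kernel_settings_sysfs:
  - name: /sys/kernel/mm/ksm/run                  # placeholder entry
    value: 0
kernel_settings_systemd_cpu_affinity: "1,3"       # placeholder
kernel_settings_transparent_hugepages: madvise    # placeholder
kernel_settings_transparent_hugepages_defrag: defer   # placeholder
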
10074 1726773064.42716: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10074 1726773064.42804: Sending initial data 10074 1726773064.42812: Sent initial data (159 bytes) 10074 1726773064.45420: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmp7r_x_2a3/kernel_settings.j2 /root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033/source <<< 10074 1726773064.45863: stderr chunk (state=3): >>><<< 10074 1726773064.45873: stdout chunk (state=3): >>><<< 10074 1726773064.45889: _low_level_execute_command(): starting 10074 1726773064.45894: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033/ /root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033/source && sleep 0' 10074 1726773064.48331: stderr chunk (state=2): >>><<< 10074 1726773064.48344: stdout chunk (state=2): >>><<< 10074 1726773064.48361: _low_level_execute_command() done: rc=0, stdout=, stderr= 10074 1726773064.48387: variable 'ansible_module_compression' from source: unknown 10074 1726773064.48422: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 10074 1726773064.48441: variable 'ansible_facts' from source: unknown 10074 1726773064.48507: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033/AnsiballZ_copy.py 10074 1726773064.48600: Sending initial data 10074 1726773064.48607: Sent initial data (151 bytes) 10074 1726773064.51196: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpu2iwrxq4 /root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033/AnsiballZ_copy.py <<< 10074 1726773064.52415: stderr chunk (state=3): >>><<< 10074 1726773064.52423: stdout chunk (state=3): >>><<< 10074 1726773064.52443: done transferring module to remote 10074 1726773064.52453: _low_level_execute_command(): starting 10074 1726773064.52459: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033/ /root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033/AnsiballZ_copy.py && sleep 0' 10074 1726773064.54889: stderr chunk (state=2): >>><<< 10074 1726773064.54898: stdout chunk (state=2): >>><<< 10074 1726773064.54912: _low_level_execute_command() done: rc=0, stdout=, stderr= 10074 1726773064.54917: _low_level_execute_command(): starting 10074 1726773064.54922: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033/AnsiballZ_copy.py && sleep 0' 10074 1726773064.72022: stdout chunk (state=2): >>> {"dest": "/etc/tuned/kernel_settings/tuned.conf", 
"src": "/root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033/source", "md5sum": "db02fa7304c082dd9b84c2d712d07ec0", "checksum": "ba15904bb90578344fad097ce2f46f9231275eae", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 312, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "ba15904bb90578344fad097ce2f46f9231275eae", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10074 1726773064.73241: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10074 1726773064.73289: stderr chunk (state=3): >>><<< 10074 1726773064.73302: stdout chunk (state=3): >>><<< 10074 1726773064.73314: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033/source", "md5sum": "db02fa7304c082dd9b84c2d712d07ec0", "checksum": "ba15904bb90578344fad097ce2f46f9231275eae", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 312, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "ba15904bb90578344fad097ce2f46f9231275eae", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.47.99 closed. 
10074 1726773064.73339: done with _execute_module (ansible.legacy.copy, {'src': '/root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033/source', 'dest': '/etc/tuned/kernel_settings/tuned.conf', 'mode': '0644', 'follow': False, '_original_basename': 'kernel_settings.j2', 'checksum': 'ba15904bb90578344fad097ce2f46f9231275eae', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10074 1726773064.73364: _low_level_execute_command(): starting 10074 1726773064.73371: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033/ > /dev/null 2>&1 && sleep 0' 10074 1726773064.75863: stderr chunk (state=2): >>><<< 10074 1726773064.75873: stdout chunk (state=2): >>><<< 10074 1726773064.75889: _low_level_execute_command() done: rc=0, stdout=, stderr= 10074 1726773064.75900: handler run complete 10074 1726773064.75919: attempt loop complete, returning result 10074 1726773064.75923: _execute() done 10074 1726773064.75926: dumping result to json 10074 1726773064.75932: done dumping result, returning 10074 1726773064.75938: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings [0affffe7-6841-7dd6-8fa6-000000000059] 10074 1726773064.75944: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000059 10074 1726773064.75992: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000059 10074 1726773064.75996: WORKER PROCESS EXITING changed: [managed_node3] => { "changed": true, "checksum": "ba15904bb90578344fad097ce2f46f9231275eae", "dest": "/etc/tuned/kernel_settings/tuned.conf", "gid": 0, "group": "root", "md5sum": "db02fa7304c082dd9b84c2d712d07ec0", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 312, "src": "/root/.ansible/tmp/ansible-tmp-1726773064.112697-10074-205341948065033/source", "state": "file", "uid": 0 } 9733 1726773064.76164: no more pending results, returning what we have 9733 1726773064.76167: results queue empty 9733 1726773064.76167: checking for any_errors_fatal 9733 1726773064.76175: done checking for any_errors_fatal 9733 1726773064.76176: checking for max_fail_percentage 9733 1726773064.76177: done checking for max_fail_percentage 9733 1726773064.76178: checking to see if all hosts have failed and the running result is not ok 9733 1726773064.76178: done checking to see if all hosts have failed 9733 1726773064.76179: getting the remaining hosts for this loop 9733 1726773064.76180: done getting the remaining hosts for this loop 9733 1726773064.76183: getting the next task for host managed_node3 9733 1726773064.76188: done getting next task for host managed_node3 9733 1726773064.76191: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 9733 1726773064.76193: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773064.76203: getting variables 9733 1726773064.76204: in VariableManager get_vars() 9733 1726773064.76233: Calling all_inventory to load vars for managed_node3 9733 1726773064.76236: Calling groups_inventory to load vars for managed_node3 9733 1726773064.76238: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773064.76246: Calling all_plugins_play to load vars for managed_node3 9733 1726773064.76248: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773064.76250: Calling groups_plugins_play to load vars for managed_node3 9733 1726773064.76367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773064.76487: done with get_vars() 9733 1726773064.76496: done getting variables 9733 1726773064.76537: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149 Thursday 19 September 2024 15:11:04 -0400 (0:00:00.693) 0:00:10.498 **** 9733 1726773064.76557: entering _queue_task() for managed_node3/service 9733 1726773064.76724: worker is 1 (out of 1 available) 9733 1726773064.76738: exiting _queue_task() for managed_node3/service 9733 1726773064.76750: done queuing things up, now waiting for results queue to drain 9733 1726773064.76752: waiting for pending results... 
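
The worker that starts next restarts tuned through the service action, looping over __kernel_settings_services and guarded by the condition the log evaluates a few lines further on (profile or mode changed). A hedged sketch of such a task, assuming the loop variable and conditional visible in the trace, could look like this:

- name: Restart tuned to apply active profile, mode changes
  ansible.builtin.service:
    name: "{{ item }}"
    state: restarted
    enabled: true
  loop: "{{ __kernel_settings_services }}"   # includes 'tuned' in this run, per the log
  when: __kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed
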
10104 1726773064.76863: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 10104 1726773064.76970: in run() - task 0affffe7-6841-7dd6-8fa6-00000000005a 10104 1726773064.76987: variable 'ansible_search_path' from source: unknown 10104 1726773064.76992: variable 'ansible_search_path' from source: unknown 10104 1726773064.77026: variable '__kernel_settings_services' from source: include_vars 10104 1726773064.77319: variable '__kernel_settings_services' from source: include_vars 10104 1726773064.77374: variable 'omit' from source: magic vars 10104 1726773064.77440: variable 'ansible_host' from source: host vars for 'managed_node3' 10104 1726773064.77447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10104 1726773064.77453: variable 'omit' from source: magic vars 10104 1726773064.77632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10104 1726773064.77800: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10104 1726773064.77832: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10104 1726773064.77859: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10104 1726773064.77888: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10104 1726773064.77960: variable '__kernel_settings_register_profile' from source: set_fact 10104 1726773064.77972: variable '__kernel_settings_register_mode' from source: set_fact 10104 1726773064.77990: Evaluated conditional (__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed): True 10104 1726773064.77996: variable 'omit' from source: magic vars 10104 1726773064.78024: variable 'omit' from source: magic vars 10104 1726773064.78055: variable 'item' from source: unknown 10104 1726773064.78105: variable 'item' from source: unknown 10104 1726773064.78121: variable 'omit' from source: magic vars 10104 1726773064.78140: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10104 1726773064.78161: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10104 1726773064.78178: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10104 1726773064.78194: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10104 1726773064.78205: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10104 1726773064.78227: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10104 1726773064.78232: variable 'ansible_host' from source: host vars for 'managed_node3' 10104 1726773064.78236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10104 1726773064.78305: Set connection var ansible_timeout to 10 10104 1726773064.78310: Set connection var ansible_shell_type to sh 10104 1726773064.78316: Set connection var ansible_module_compression to ZIP_DEFLATED 10104 1726773064.78321: Set connection var ansible_shell_executable to /bin/sh 10104 1726773064.78327: Set connection var ansible_pipelining to False 10104 1726773064.78333: Set connection var 
ansible_connection to ssh 10104 1726773064.78347: variable 'ansible_shell_executable' from source: unknown 10104 1726773064.78350: variable 'ansible_connection' from source: unknown 10104 1726773064.78354: variable 'ansible_module_compression' from source: unknown 10104 1726773064.78357: variable 'ansible_shell_type' from source: unknown 10104 1726773064.78360: variable 'ansible_shell_executable' from source: unknown 10104 1726773064.78364: variable 'ansible_host' from source: host vars for 'managed_node3' 10104 1726773064.78368: variable 'ansible_pipelining' from source: unknown 10104 1726773064.78373: variable 'ansible_timeout' from source: unknown 10104 1726773064.78378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10104 1726773064.78446: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10104 1726773064.78457: variable 'omit' from source: magic vars 10104 1726773064.78464: starting attempt loop 10104 1726773064.78467: running the handler 10104 1726773064.78526: variable 'ansible_facts' from source: unknown 10104 1726773064.78614: _low_level_execute_command(): starting 10104 1726773064.78624: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10104 1726773064.81014: stdout chunk (state=2): >>>/root <<< 10104 1726773064.81135: stderr chunk (state=3): >>><<< 10104 1726773064.81142: stdout chunk (state=3): >>><<< 10104 1726773064.81162: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10104 1726773064.81176: _low_level_execute_command(): starting 10104 1726773064.81183: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773064.811699-10104-106530428383857 `" && echo ansible-tmp-1726773064.811699-10104-106530428383857="` echo /root/.ansible/tmp/ansible-tmp-1726773064.811699-10104-106530428383857 `" ) && sleep 0' 10104 1726773064.83707: stdout chunk (state=2): >>>ansible-tmp-1726773064.811699-10104-106530428383857=/root/.ansible/tmp/ansible-tmp-1726773064.811699-10104-106530428383857 <<< 10104 1726773064.83842: stderr chunk (state=3): >>><<< 10104 1726773064.83849: stdout chunk (state=3): >>><<< 10104 1726773064.83867: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773064.811699-10104-106530428383857=/root/.ansible/tmp/ansible-tmp-1726773064.811699-10104-106530428383857 , stderr= 10104 1726773064.83894: variable 'ansible_module_compression' from source: unknown 10104 1726773064.83933: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 10104 1726773064.83987: variable 'ansible_facts' from source: unknown 10104 1726773064.84140: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773064.811699-10104-106530428383857/AnsiballZ_systemd.py 10104 1726773064.84251: Sending initial data 10104 1726773064.84259: Sent initial data (154 bytes) 10104 1726773064.86833: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpknidxp7n /root/.ansible/tmp/ansible-tmp-1726773064.811699-10104-106530428383857/AnsiballZ_systemd.py <<< 10104 1726773064.88947: stderr chunk (state=3): >>><<< 10104 1726773064.88955: stdout chunk (state=3): >>><<< 10104 
1726773064.88976: done transferring module to remote 10104 1726773064.88989: _low_level_execute_command(): starting 10104 1726773064.88995: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773064.811699-10104-106530428383857/ /root/.ansible/tmp/ansible-tmp-1726773064.811699-10104-106530428383857/AnsiballZ_systemd.py && sleep 0' 10104 1726773064.91482: stderr chunk (state=2): >>><<< 10104 1726773064.91493: stdout chunk (state=2): >>><<< 10104 1726773064.91507: _low_level_execute_command() done: rc=0, stdout=, stderr= 10104 1726773064.91511: _low_level_execute_command(): starting 10104 1726773064.91517: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773064.811699-10104-106530428383857/AnsiballZ_systemd.py && sleep 0' 10104 1726773065.44305: stdout chunk (state=2): >>> {"name": "tuned", "changed": true, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:38 EDT", "WatchdogTimestampMonotonic": "453344532", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "9802", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ExecMainStartTimestampMonotonic": "453204995", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "9802", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:38 EDT] ; stop_time=[n/a] ; pid=9802 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "17055744", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", 
"LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:38 EDT", "StateChangeTimestampMonotonic": "453344536", 
"InactiveExitTimestamp": "Thu 2024-09-19 15:10:38 EDT", "InactiveExitTimestampMonotonic": "453205057", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ActiveEnterTimestampMonotonic": "453344536", "ActiveExitTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ActiveExitTimestampMonotonic": "453097312", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:38 EDT", "InactiveEnterTimestampMonotonic": "453201635", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ConditionTimestampMonotonic": "453202685", "AssertTimestamp": "Thu 2024-09-19 15:10:38 EDT", "AssertTimestampMonotonic": "453202686", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "29d42365ee9e42d1916b9ebf15b9284e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "restarted", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 10104 1726773065.46159: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10104 1726773065.46174: stdout chunk (state=3): >>><<< 10104 1726773065.46189: stderr chunk (state=3): >>><<< 10104 1726773065.46211: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": true, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:38 EDT", "WatchdogTimestampMonotonic": "453344532", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "9802", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ExecMainStartTimestampMonotonic": "453204995", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "9802", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:38 EDT] ; stop_time=[n/a] ; pid=9802 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "17055744", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", 
"AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", 
"SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:38 EDT", "StateChangeTimestampMonotonic": "453344536", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:38 EDT", "InactiveExitTimestampMonotonic": "453205057", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ActiveEnterTimestampMonotonic": "453344536", "ActiveExitTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ActiveExitTimestampMonotonic": "453097312", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:38 EDT", "InactiveEnterTimestampMonotonic": "453201635", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ConditionTimestampMonotonic": "453202685", "AssertTimestamp": "Thu 2024-09-19 15:10:38 EDT", "AssertTimestampMonotonic": "453202686", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "29d42365ee9e42d1916b9ebf15b9284e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "restarted", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.47.99 closed. 
10104 1726773065.46387: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'restarted', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773064.811699-10104-106530428383857/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10104 1726773065.46411: _low_level_execute_command(): starting 10104 1726773065.46419: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773064.811699-10104-106530428383857/ > /dev/null 2>&1 && sleep 0' 10104 1726773065.49233: stderr chunk (state=2): >>><<< 10104 1726773065.49246: stdout chunk (state=2): >>><<< 10104 1726773065.49264: _low_level_execute_command() done: rc=0, stdout=, stderr= 10104 1726773065.49276: handler run complete 10104 1726773065.49330: attempt loop complete, returning result 10104 1726773065.49352: variable 'item' from source: unknown 10104 1726773065.49436: variable 'item' from source: unknown changed: [managed_node3] => (item=tuned) => { "ansible_loop_var": "item", "changed": true, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ActiveEnterTimestampMonotonic": "453344536", "ActiveExitTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ActiveExitTimestampMonotonic": "453097312", "ActiveState": "active", "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 15:10:38 EDT", "AssertTimestampMonotonic": "453202686", "Before": "shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ConditionTimestampMonotonic": "453202685", "ConfigurationDirectoryMode": "0755", "Conflicts": 
"power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "9802", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ExecMainStartTimestampMonotonic": "453204995", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:38 EDT] ; stop_time=[n/a] ; pid=9802 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:38 EDT", "InactiveEnterTimestampMonotonic": "453201635", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:38 EDT", "InactiveExitTimestampMonotonic": "453205057", "InvocationID": "29d42365ee9e42d1916b9ebf15b9284e", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "9802", "MemoryAccounting": "yes", "MemoryCurrent": "17055744", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", 
"PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 15:10:38 EDT", "StateChangeTimestampMonotonic": "453344536", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 15:10:38 EDT", "WatchdogTimestampMonotonic": "453344532", "WatchdogUSec": "0" } } 10104 1726773065.49563: dumping result to json 10104 1726773065.49576: done dumping result, returning 10104 1726773065.49583: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes [0affffe7-6841-7dd6-8fa6-00000000005a] 10104 1726773065.49589: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000005a 10104 1726773065.49682: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000005a 10104 1726773065.49689: WORKER PROCESS EXITING 9733 1726773065.50800: no more pending results, returning what we have 9733 1726773065.50804: results queue empty 9733 1726773065.50804: checking for any_errors_fatal 9733 1726773065.50813: done checking for any_errors_fatal 9733 1726773065.50814: checking for max_fail_percentage 9733 1726773065.50815: done checking for max_fail_percentage 9733 1726773065.50815: checking to see if all hosts have failed and the running result is not ok 9733 1726773065.50816: done checking to see if all hosts have failed 9733 1726773065.50817: getting the remaining hosts for this loop 9733 1726773065.50818: done getting the remaining hosts for this loop 9733 1726773065.50820: getting the next task for host managed_node3 9733 1726773065.50826: done getting next task for host managed_node3 9733 1726773065.50829: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 9733 1726773065.50831: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773065.50840: getting variables 9733 1726773065.50842: in VariableManager get_vars() 9733 1726773065.50868: Calling all_inventory to load vars for managed_node3 9733 1726773065.50870: Calling groups_inventory to load vars for managed_node3 9733 1726773065.50873: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773065.50883: Calling all_plugins_play to load vars for managed_node3 9733 1726773065.50887: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773065.50890: Calling groups_plugins_play to load vars for managed_node3 9733 1726773065.51047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773065.51269: done with get_vars() 9733 1726773065.51280: done getting variables 9733 1726773065.51372: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Tuned apply settings] ******** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 Thursday 19 September 2024 15:11:05 -0400 (0:00:00.748) 0:00:11.246 **** 9733 1726773065.51404: entering _queue_task() for managed_node3/command 9733 1726773065.51405: Creating lock for command 9733 1726773065.51653: worker is 1 (out of 1 available) 9733 1726773065.51666: exiting _queue_task() for managed_node3/command 9733 1726773065.51678: done queuing things up, now waiting for results queue to drain 9733 1726773065.51680: waiting for pending results... 
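
The next task is skipped because its when clause evaluates to False: once the profile file has changed, "not __kernel_settings_register_profile is changed" is False and Ansible reports the task as skipped with skip_reason "Conditional result was False". A minimal illustration of that mechanic follows; the command is a placeholder, since the skipped task's actual command never appears in this log.

- name: Tuned apply settings
  ansible.builtin.command: /bin/true   # placeholder; the real command is not visible in this log
  when: not __kernel_settings_register_profile is changed
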
10137 1726773065.51905: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 10137 1726773065.52040: in run() - task 0affffe7-6841-7dd6-8fa6-00000000005b 10137 1726773065.52057: variable 'ansible_search_path' from source: unknown 10137 1726773065.52062: variable 'ansible_search_path' from source: unknown 10137 1726773065.52098: calling self._execute() 10137 1726773065.52176: variable 'ansible_host' from source: host vars for 'managed_node3' 10137 1726773065.52187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10137 1726773065.52196: variable 'omit' from source: magic vars 10137 1726773065.52640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10137 1726773065.52900: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10137 1726773065.52947: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10137 1726773065.52980: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10137 1726773065.53016: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10137 1726773065.53133: variable '__kernel_settings_register_profile' from source: set_fact 10137 1726773065.53161: Evaluated conditional (not __kernel_settings_register_profile is changed): False 10137 1726773065.53167: when evaluation is False, skipping this task 10137 1726773065.53170: _execute() done 10137 1726773065.53173: dumping result to json 10137 1726773065.53176: done dumping result, returning 10137 1726773065.53183: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings [0affffe7-6841-7dd6-8fa6-00000000005b] 10137 1726773065.53190: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000005b 10137 1726773065.53221: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000005b 10137 1726773065.53224: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __kernel_settings_register_profile is changed", "skip_reason": "Conditional result was False" } 9733 1726773065.53630: no more pending results, returning what we have 9733 1726773065.53633: results queue empty 9733 1726773065.53633: checking for any_errors_fatal 9733 1726773065.53649: done checking for any_errors_fatal 9733 1726773065.53650: checking for max_fail_percentage 9733 1726773065.53651: done checking for max_fail_percentage 9733 1726773065.53652: checking to see if all hosts have failed and the running result is not ok 9733 1726773065.53652: done checking to see if all hosts have failed 9733 1726773065.53653: getting the remaining hosts for this loop 9733 1726773065.53654: done getting the remaining hosts for this loop 9733 1726773065.53657: getting the next task for host managed_node3 9733 1726773065.53664: done getting next task for host managed_node3 9733 1726773065.53667: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 9733 1726773065.53672: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773065.53688: getting variables 9733 1726773065.53689: in VariableManager get_vars() 9733 1726773065.53721: Calling all_inventory to load vars for managed_node3 9733 1726773065.53724: Calling groups_inventory to load vars for managed_node3 9733 1726773065.53726: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773065.53734: Calling all_plugins_play to load vars for managed_node3 9733 1726773065.53737: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773065.53739: Calling groups_plugins_play to load vars for managed_node3 9733 1726773065.53909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773065.54111: done with get_vars() 9733 1726773065.54122: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Verify settings] ************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:166 Thursday 19 September 2024 15:11:05 -0400 (0:00:00.028) 0:00:11.274 **** 9733 1726773065.54216: entering _queue_task() for managed_node3/include_tasks 9733 1726773065.54446: worker is 1 (out of 1 available) 9733 1726773065.54461: exiting _queue_task() for managed_node3/include_tasks 9733 1726773065.54475: done queuing things up, now waiting for results queue to drain 9733 1726773065.54477: waiting for pending results... 
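
The "Verify settings" step is a conditional include: because __kernel_settings_register_apply is changed evaluates to True, verify_settings.yml is loaded from the role and its tasks are appended for managed_node3. A sketch matching the trace (the file name and condition are taken from the log, the rest is an approximation) is:

- name: Verify settings
  ansible.builtin.include_tasks: verify_settings.yml
  when: __kernel_settings_register_apply is changed
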
10138 1726773065.55208: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Verify settings 10138 1726773065.55347: in run() - task 0affffe7-6841-7dd6-8fa6-00000000005c 10138 1726773065.55366: variable 'ansible_search_path' from source: unknown 10138 1726773065.55374: variable 'ansible_search_path' from source: unknown 10138 1726773065.55410: calling self._execute() 10138 1726773065.55495: variable 'ansible_host' from source: host vars for 'managed_node3' 10138 1726773065.55505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10138 1726773065.55513: variable 'omit' from source: magic vars 10138 1726773065.55966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10138 1726773065.56266: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10138 1726773065.56396: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10138 1726773065.56430: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10138 1726773065.56462: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10138 1726773065.56567: variable '__kernel_settings_register_apply' from source: set_fact 10138 1726773065.56597: Evaluated conditional (__kernel_settings_register_apply is changed): True 10138 1726773065.56605: _execute() done 10138 1726773065.56610: dumping result to json 10138 1726773065.56613: done dumping result, returning 10138 1726773065.56619: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Verify settings [0affffe7-6841-7dd6-8fa6-00000000005c] 10138 1726773065.56625: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000005c 10138 1726773065.56655: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000005c 10138 1726773065.56659: WORKER PROCESS EXITING 9733 1726773065.57129: no more pending results, returning what we have 9733 1726773065.57133: in VariableManager get_vars() 9733 1726773065.57174: Calling all_inventory to load vars for managed_node3 9733 1726773065.57177: Calling groups_inventory to load vars for managed_node3 9733 1726773065.57179: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773065.57191: Calling all_plugins_play to load vars for managed_node3 9733 1726773065.57193: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773065.57196: Calling groups_plugins_play to load vars for managed_node3 9733 1726773065.57367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773065.57624: done with get_vars() 9733 1726773065.57632: variable 'ansible_search_path' from source: unknown 9733 1726773065.57633: variable 'ansible_search_path' from source: unknown 9733 1726773065.57665: we have included files to process 9733 1726773065.57666: generating all_blocks data 9733 1726773065.57672: done generating all_blocks data 9733 1726773065.57677: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 9733 1726773065.57678: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 9733 1726773065.57681: Loading data from 
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml for managed_node3 9733 1726773065.58162: done processing included file 9733 1726773065.58165: iterating over new_blocks loaded from include file 9733 1726773065.58166: in VariableManager get_vars() 9733 1726773065.58202: done with get_vars() 9733 1726773065.58204: filtering new block on tags 9733 1726773065.58253: done filtering new block on tags 9733 1726773065.58255: done iterating over new_blocks loaded from include file 9733 1726773065.58256: extending task lists for all hosts with included blocks 9733 1726773065.58889: done extending task lists 9733 1726773065.58890: done processing included files 9733 1726773065.58891: results queue empty 9733 1726773065.58892: checking for any_errors_fatal 9733 1726773065.58895: done checking for any_errors_fatal 9733 1726773065.58896: checking for max_fail_percentage 9733 1726773065.58897: done checking for max_fail_percentage 9733 1726773065.58897: checking to see if all hosts have failed and the running result is not ok 9733 1726773065.58898: done checking to see if all hosts have failed 9733 1726773065.58899: getting the remaining hosts for this loop 9733 1726773065.58899: done getting the remaining hosts for this loop 9733 1726773065.58902: getting the next task for host managed_node3 9733 1726773065.58905: done getting next task for host managed_node3 9733 1726773065.58908: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 9733 1726773065.58910: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773065.58919: getting variables 9733 1726773065.58920: in VariableManager get_vars() 9733 1726773065.58933: Calling all_inventory to load vars for managed_node3 9733 1726773065.58935: Calling groups_inventory to load vars for managed_node3 9733 1726773065.58937: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773065.58941: Calling all_plugins_play to load vars for managed_node3 9733 1726773065.58944: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773065.58946: Calling groups_plugins_play to load vars for managed_node3 9733 1726773065.59121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773065.59323: done with get_vars() 9733 1726773065.59332: done getting variables 9733 1726773065.59372: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2 Thursday 19 September 2024 15:11:05 -0400 (0:00:00.051) 0:00:11.326 **** 9733 1726773065.59406: entering _queue_task() for managed_node3/command 9733 1726773065.59663: worker is 1 (out of 1 available) 9733 1726773065.59680: exiting _queue_task() for managed_node3/command 9733 1726773065.59693: done queuing things up, now waiting for results queue to drain 9733 1726773065.59695: waiting for pending results... 10139 1726773065.60000: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 10139 1726773065.60149: in run() - task 0affffe7-6841-7dd6-8fa6-00000000012b 10139 1726773065.60167: variable 'ansible_search_path' from source: unknown 10139 1726773065.60175: variable 'ansible_search_path' from source: unknown 10139 1726773065.60209: calling self._execute() 10139 1726773065.60288: variable 'ansible_host' from source: host vars for 'managed_node3' 10139 1726773065.60297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10139 1726773065.60305: variable 'omit' from source: magic vars 10139 1726773065.60402: variable 'omit' from source: magic vars 10139 1726773065.60461: variable 'omit' from source: magic vars 10139 1726773065.60496: variable 'omit' from source: magic vars 10139 1726773065.60538: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10139 1726773065.60576: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10139 1726773065.60600: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10139 1726773065.60619: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10139 1726773065.60631: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10139 1726773065.60660: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10139 1726773065.60665: variable 'ansible_host' from source: host vars for 'managed_node3' 10139 1726773065.60673: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10139 1726773065.60776: Set connection var ansible_timeout to 10 10139 1726773065.60782: Set connection var ansible_shell_type to sh 10139 1726773065.60789: Set connection var ansible_module_compression to ZIP_DEFLATED 10139 1726773065.60795: Set connection var ansible_shell_executable to /bin/sh 10139 1726773065.60800: Set connection var ansible_pipelining to False 10139 1726773065.60807: Set connection var ansible_connection to ssh 10139 1726773065.60826: variable 'ansible_shell_executable' from source: unknown 10139 1726773065.60831: variable 'ansible_connection' from source: unknown 10139 1726773065.60834: variable 'ansible_module_compression' from source: unknown 10139 1726773065.60837: variable 'ansible_shell_type' from source: unknown 10139 1726773065.60840: variable 'ansible_shell_executable' from source: unknown 10139 1726773065.60843: variable 'ansible_host' from source: host vars for 'managed_node3' 10139 1726773065.60847: variable 'ansible_pipelining' from source: unknown 10139 1726773065.60850: variable 'ansible_timeout' from source: unknown 10139 1726773065.60853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10139 1726773065.61082: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10139 1726773065.61097: variable 'omit' from source: magic vars 10139 1726773065.61104: starting attempt loop 10139 1726773065.61108: running the handler 10139 1726773065.61123: _low_level_execute_command(): starting 10139 1726773065.61132: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10139 1726773065.63805: stdout chunk (state=2): >>>/root <<< 10139 1726773065.63952: stderr chunk (state=3): >>><<< 10139 1726773065.63961: stdout chunk (state=3): >>><<< 10139 1726773065.63988: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10139 1726773065.64004: _low_level_execute_command(): starting 10139 1726773065.64012: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773065.6399765-10139-124482947222177 `" && echo ansible-tmp-1726773065.6399765-10139-124482947222177="` echo /root/.ansible/tmp/ansible-tmp-1726773065.6399765-10139-124482947222177 `" ) && sleep 0' 10139 1726773065.67071: stdout chunk (state=2): >>>ansible-tmp-1726773065.6399765-10139-124482947222177=/root/.ansible/tmp/ansible-tmp-1726773065.6399765-10139-124482947222177 <<< 10139 1726773065.67233: stderr chunk (state=3): >>><<< 10139 1726773065.67242: stdout chunk (state=3): >>><<< 10139 1726773065.67260: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773065.6399765-10139-124482947222177=/root/.ansible/tmp/ansible-tmp-1726773065.6399765-10139-124482947222177 , stderr= 10139 1726773065.67296: variable 'ansible_module_compression' from source: unknown 10139 1726773065.67353: ANSIBALLZ: Using generic lock for ansible.legacy.command 10139 1726773065.67358: ANSIBALLZ: Acquiring lock 10139 1726773065.67361: ANSIBALLZ: Lock acquired: 139792132305312 10139 1726773065.67364: ANSIBALLZ: Creating module 10139 1726773065.81120: ANSIBALLZ: Writing module into payload 10139 1726773065.81240: ANSIBALLZ: Writing module 10139 
1726773065.81264: ANSIBALLZ: Renaming module 10139 1726773065.81273: ANSIBALLZ: Done creating module 10139 1726773065.81292: variable 'ansible_facts' from source: unknown 10139 1726773065.81382: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773065.6399765-10139-124482947222177/AnsiballZ_command.py 10139 1726773065.81882: Sending initial data 10139 1726773065.81892: Sent initial data (155 bytes) 10139 1726773065.85011: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmp6g8k6hze /root/.ansible/tmp/ansible-tmp-1726773065.6399765-10139-124482947222177/AnsiballZ_command.py <<< 10139 1726773065.87131: stderr chunk (state=3): >>><<< 10139 1726773065.87145: stdout chunk (state=3): >>><<< 10139 1726773065.87169: done transferring module to remote 10139 1726773065.87183: _low_level_execute_command(): starting 10139 1726773065.87193: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773065.6399765-10139-124482947222177/ /root/.ansible/tmp/ansible-tmp-1726773065.6399765-10139-124482947222177/AnsiballZ_command.py && sleep 0' 10139 1726773065.90393: stderr chunk (state=2): >>><<< 10139 1726773065.90406: stdout chunk (state=2): >>><<< 10139 1726773065.90425: _low_level_execute_command() done: rc=0, stdout=, stderr= 10139 1726773065.90430: _low_level_execute_command(): starting 10139 1726773065.90436: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773065.6399765-10139-124482947222177/AnsiballZ_command.py && sleep 0' 10139 1726773066.16589: stdout chunk (state=2): >>> {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 15:11:06.058419", "end": "2024-09-19 15:11:06.163889", "delta": "0:00:00.105470", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10139 1726773066.17907: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10139 1726773066.17919: stdout chunk (state=3): >>><<< 10139 1726773066.17931: stderr chunk (state=3): >>><<< 10139 1726773066.17948: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 15:11:06.058419", "end": "2024-09-19 15:11:06.163889", "delta": "0:00:00.105470", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.47.99 closed. 
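The module invocation above corresponds to the task "Check that settings are applied correctly" at verify_settings.yml:2, which runs `tuned-adm verify -i` through ansible.legacy.command. A minimal sketch of what that task plausibly looks like, reconstructed from the invocation above and from the conditionals that appear later in this log; the register name and changed_when are inferences, not the role's verbatim source:

  - name: Check that settings are applied correctly
    # register name taken from the later '__kernel_settings_register_verify_values
    # is failed' conditionals in this log; changed_when: false is inferred from the
    # "Evaluated conditional (False): False" line in the handler output
    command: tuned-adm verify -i
    register: __kernel_settings_register_verify_values
    changed_when: false
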
10139 1726773066.18001: done with _execute_module (ansible.legacy.command, {'_raw_params': 'tuned-adm verify -i', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773065.6399765-10139-124482947222177/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10139 1726773066.18013: _low_level_execute_command(): starting 10139 1726773066.18019: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773065.6399765-10139-124482947222177/ > /dev/null 2>&1 && sleep 0' 10139 1726773066.20793: stderr chunk (state=2): >>><<< 10139 1726773066.20806: stdout chunk (state=2): >>><<< 10139 1726773066.20824: _low_level_execute_command() done: rc=0, stdout=, stderr= 10139 1726773066.20833: handler run complete 10139 1726773066.20857: Evaluated conditional (False): False 10139 1726773066.20873: attempt loop complete, returning result 10139 1726773066.20879: _execute() done 10139 1726773066.20885: dumping result to json 10139 1726773066.20891: done dumping result, returning 10139 1726773066.20901: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly [0affffe7-6841-7dd6-8fa6-00000000012b] 10139 1726773066.20908: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000012b 10139 1726773066.20948: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000012b 10139 1726773066.20952: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "tuned-adm", "verify", "-i" ], "delta": "0:00:00.105470", "end": "2024-09-19 15:11:06.163889", "rc": 0, "start": "2024-09-19 15:11:06.058419" } STDOUT: Verification succeeded, current system settings match the preset profile. See TuneD log file ('/var/log/tuned/tuned.log') for details. 9733 1726773066.21474: no more pending results, returning what we have 9733 1726773066.21477: results queue empty 9733 1726773066.21478: checking for any_errors_fatal 9733 1726773066.21480: done checking for any_errors_fatal 9733 1726773066.21480: checking for max_fail_percentage 9733 1726773066.21482: done checking for max_fail_percentage 9733 1726773066.21482: checking to see if all hosts have failed and the running result is not ok 9733 1726773066.21483: done checking to see if all hosts have failed 9733 1726773066.21483: getting the remaining hosts for this loop 9733 1726773066.21486: done getting the remaining hosts for this loop 9733 1726773066.21490: getting the next task for host managed_node3 9733 1726773066.21495: done getting next task for host managed_node3 9733 1726773066.21498: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 9733 1726773066.21501: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773066.21512: getting variables 9733 1726773066.21513: in VariableManager get_vars() 9733 1726773066.21549: Calling all_inventory to load vars for managed_node3 9733 1726773066.21552: Calling groups_inventory to load vars for managed_node3 9733 1726773066.21554: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773066.21564: Calling all_plugins_play to load vars for managed_node3 9733 1726773066.21566: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773066.21572: Calling groups_plugins_play to load vars for managed_node3 9733 1726773066.21743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773066.21960: done with get_vars() 9733 1726773066.21975: done getting variables 9733 1726773066.22066: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Get last verify results from log] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:12 Thursday 19 September 2024 15:11:06 -0400 (0:00:00.626) 0:00:11.953 **** 9733 1726773066.22103: entering _queue_task() for managed_node3/shell 9733 1726773066.22105: Creating lock for shell 9733 1726773066.22359: worker is 1 (out of 1 available) 9733 1726773066.22377: exiting _queue_task() for managed_node3/shell 9733 1726773066.22390: done queuing things up, now waiting for results queue to drain 9733 1726773066.22393: waiting for pending results... 
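The task queued here, "Get last verify results from log" (verify_settings.yml:12), uses the shell action and is skipped in the trace that follows because `__kernel_settings_register_verify_values is failed` evaluates to false. A hypothetical sketch under that assumption; only the task name and the when: condition come from this log, and the command body is illustrative rather than the role's actual source:

  - name: Get last verify results from log
    # the grep of the TuneD log is an assumption; the log only records that this
    # task was skipped because the verify command did not fail
    shell: grep -e WARNING -e ERROR /var/log/tuned/tuned.log || true
    when: __kernel_settings_register_verify_values is failed
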
10192 1726773066.22710: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 10192 1726773066.22876: in run() - task 0affffe7-6841-7dd6-8fa6-00000000012c 10192 1726773066.22896: variable 'ansible_search_path' from source: unknown 10192 1726773066.22901: variable 'ansible_search_path' from source: unknown 10192 1726773066.22934: calling self._execute() 10192 1726773066.23016: variable 'ansible_host' from source: host vars for 'managed_node3' 10192 1726773066.23026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10192 1726773066.23035: variable 'omit' from source: magic vars 10192 1726773066.23575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10192 1726773066.23866: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10192 1726773066.23913: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10192 1726773066.23947: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10192 1726773066.23983: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10192 1726773066.24092: variable '__kernel_settings_register_verify_values' from source: set_fact 10192 1726773066.24118: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 10192 1726773066.24123: when evaluation is False, skipping this task 10192 1726773066.24126: _execute() done 10192 1726773066.24130: dumping result to json 10192 1726773066.24133: done dumping result, returning 10192 1726773066.24140: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log [0affffe7-6841-7dd6-8fa6-00000000012c] 10192 1726773066.24146: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000012c 10192 1726773066.24178: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000012c 10192 1726773066.24182: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 9733 1726773066.24578: no more pending results, returning what we have 9733 1726773066.24581: results queue empty 9733 1726773066.24581: checking for any_errors_fatal 9733 1726773066.24589: done checking for any_errors_fatal 9733 1726773066.24589: checking for max_fail_percentage 9733 1726773066.24590: done checking for max_fail_percentage 9733 1726773066.24591: checking to see if all hosts have failed and the running result is not ok 9733 1726773066.24591: done checking to see if all hosts have failed 9733 1726773066.24592: getting the remaining hosts for this loop 9733 1726773066.24592: done getting the remaining hosts for this loop 9733 1726773066.24595: getting the next task for host managed_node3 9733 1726773066.24599: done getting next task for host managed_node3 9733 1726773066.24602: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 9733 1726773066.24604: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773066.24613: getting variables 9733 1726773066.24614: in VariableManager get_vars() 9733 1726773066.24638: Calling all_inventory to load vars for managed_node3 9733 1726773066.24639: Calling groups_inventory to load vars for managed_node3 9733 1726773066.24642: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773066.24650: Calling all_plugins_play to load vars for managed_node3 9733 1726773066.24652: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773066.24654: Calling groups_plugins_play to load vars for managed_node3 9733 1726773066.24807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773066.24926: done with get_vars() 9733 1726773066.24934: done getting variables 9733 1726773066.24978: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:23 Thursday 19 September 2024 15:11:06 -0400 (0:00:00.029) 0:00:11.982 **** 9733 1726773066.25006: entering _queue_task() for managed_node3/fail 9733 1726773066.25211: worker is 1 (out of 1 available) 9733 1726773066.25225: exiting _queue_task() for managed_node3/fail 9733 1726773066.25236: done queuing things up, now waiting for results queue to drain 9733 1726773066.25239: waiting for pending results... 
10193 1726773066.25495: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 10193 1726773066.25639: in run() - task 0affffe7-6841-7dd6-8fa6-00000000012d 10193 1726773066.25657: variable 'ansible_search_path' from source: unknown 10193 1726773066.25662: variable 'ansible_search_path' from source: unknown 10193 1726773066.25695: calling self._execute() 10193 1726773066.25773: variable 'ansible_host' from source: host vars for 'managed_node3' 10193 1726773066.25783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10193 1726773066.25794: variable 'omit' from source: magic vars 10193 1726773066.26224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10193 1726773066.26476: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10193 1726773066.26524: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10193 1726773066.26560: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10193 1726773066.26595: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10193 1726773066.26699: variable '__kernel_settings_register_verify_values' from source: set_fact 10193 1726773066.26725: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 10193 1726773066.26731: when evaluation is False, skipping this task 10193 1726773066.26734: _execute() done 10193 1726773066.26737: dumping result to json 10193 1726773066.26742: done dumping result, returning 10193 1726773066.26748: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors [0affffe7-6841-7dd6-8fa6-00000000012d] 10193 1726773066.26754: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000012d 10193 1726773066.26784: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000012d 10193 1726773066.26790: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 9733 1726773066.27165: no more pending results, returning what we have 9733 1726773066.27169: results queue empty 9733 1726773066.27170: checking for any_errors_fatal 9733 1726773066.27175: done checking for any_errors_fatal 9733 1726773066.27176: checking for max_fail_percentage 9733 1726773066.27177: done checking for max_fail_percentage 9733 1726773066.27178: checking to see if all hosts have failed and the running result is not ok 9733 1726773066.27178: done checking to see if all hosts have failed 9733 1726773066.27179: getting the remaining hosts for this loop 9733 1726773066.27180: done getting the remaining hosts for this loop 9733 1726773066.27183: getting the next task for host managed_node3 9733 1726773066.27191: done getting next task for host managed_node3 9733 1726773066.27195: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 9733 1726773066.27197: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773066.27210: getting variables 9733 1726773066.27211: in VariableManager get_vars() 9733 1726773066.27244: Calling all_inventory to load vars for managed_node3 9733 1726773066.27247: Calling groups_inventory to load vars for managed_node3 9733 1726773066.27249: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773066.27258: Calling all_plugins_play to load vars for managed_node3 9733 1726773066.27260: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773066.27263: Calling groups_plugins_play to load vars for managed_node3 9733 1726773066.27434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773066.27629: done with get_vars() 9733 1726773066.27640: done getting variables 9733 1726773066.27697: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:177 Thursday 19 September 2024 15:11:06 -0400 (0:00:00.027) 0:00:12.009 **** 9733 1726773066.27726: entering _queue_task() for managed_node3/set_fact 9733 1726773066.27962: worker is 1 (out of 1 available) 9733 1726773066.27978: exiting _queue_task() for managed_node3/set_fact 9733 1726773066.27993: done queuing things up, now waiting for results queue to drain 9733 1726773066.27995: waiting for pending results... 
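The set_fact task queued here, "Set the flag that reboot is needed to apply changes" (main.yml:177), produces the result shown below, kernel_settings_reboot_required: false. A sketch of the shape of that task; whether the role derives the value from earlier registered results rather than assigning a literal is not visible in this log:

  - name: Set the flag that reboot is needed to apply changes
    set_fact:
      # literal false matches the result printed below; the real role may compute
      # this from registered results of the apply step instead
      kernel_settings_reboot_required: false
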
10194 1726773066.28228: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 10194 1726773066.28358: in run() - task 0affffe7-6841-7dd6-8fa6-00000000005d 10194 1726773066.28376: variable 'ansible_search_path' from source: unknown 10194 1726773066.28381: variable 'ansible_search_path' from source: unknown 10194 1726773066.28416: calling self._execute() 10194 1726773066.28497: variable 'ansible_host' from source: host vars for 'managed_node3' 10194 1726773066.28507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10194 1726773066.28515: variable 'omit' from source: magic vars 10194 1726773066.28611: variable 'omit' from source: magic vars 10194 1726773066.28658: variable 'omit' from source: magic vars 10194 1726773066.28690: variable 'omit' from source: magic vars 10194 1726773066.28732: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10194 1726773066.28767: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10194 1726773066.28790: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10194 1726773066.28810: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10194 1726773066.28822: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10194 1726773066.28851: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10194 1726773066.28856: variable 'ansible_host' from source: host vars for 'managed_node3' 10194 1726773066.28860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10194 1726773066.28959: Set connection var ansible_timeout to 10 10194 1726773066.28965: Set connection var ansible_shell_type to sh 10194 1726773066.28971: Set connection var ansible_module_compression to ZIP_DEFLATED 10194 1726773066.28976: Set connection var ansible_shell_executable to /bin/sh 10194 1726773066.28981: Set connection var ansible_pipelining to False 10194 1726773066.28990: Set connection var ansible_connection to ssh 10194 1726773066.29009: variable 'ansible_shell_executable' from source: unknown 10194 1726773066.29013: variable 'ansible_connection' from source: unknown 10194 1726773066.29016: variable 'ansible_module_compression' from source: unknown 10194 1726773066.29020: variable 'ansible_shell_type' from source: unknown 10194 1726773066.29023: variable 'ansible_shell_executable' from source: unknown 10194 1726773066.29026: variable 'ansible_host' from source: host vars for 'managed_node3' 10194 1726773066.29030: variable 'ansible_pipelining' from source: unknown 10194 1726773066.29033: variable 'ansible_timeout' from source: unknown 10194 1726773066.29036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10194 1726773066.29263: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10194 1726773066.29276: variable 'omit' from source: magic vars 10194 1726773066.29283: starting attempt loop 10194 1726773066.29288: running the handler 10194 1726773066.29300: handler 
run complete 10194 1726773066.29309: attempt loop complete, returning result 10194 1726773066.29313: _execute() done 10194 1726773066.29316: dumping result to json 10194 1726773066.29319: done dumping result, returning 10194 1726773066.29325: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes [0affffe7-6841-7dd6-8fa6-00000000005d] 10194 1726773066.29331: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000005d 10194 1726773066.29354: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000005d 10194 1726773066.29357: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "kernel_settings_reboot_required": false }, "changed": false } 9733 1726773066.29726: no more pending results, returning what we have 9733 1726773066.29729: results queue empty 9733 1726773066.29730: checking for any_errors_fatal 9733 1726773066.29735: done checking for any_errors_fatal 9733 1726773066.29736: checking for max_fail_percentage 9733 1726773066.29737: done checking for max_fail_percentage 9733 1726773066.29737: checking to see if all hosts have failed and the running result is not ok 9733 1726773066.29738: done checking to see if all hosts have failed 9733 1726773066.29738: getting the remaining hosts for this loop 9733 1726773066.29739: done getting the remaining hosts for this loop 9733 1726773066.29742: getting the next task for host managed_node3 9733 1726773066.29747: done getting next task for host managed_node3 9733 1726773066.29750: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 9733 1726773066.29753: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773066.29762: getting variables 9733 1726773066.29763: in VariableManager get_vars() 9733 1726773066.29796: Calling all_inventory to load vars for managed_node3 9733 1726773066.29799: Calling groups_inventory to load vars for managed_node3 9733 1726773066.29801: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773066.29809: Calling all_plugins_play to load vars for managed_node3 9733 1726773066.29812: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773066.29815: Calling groups_plugins_play to load vars for managed_node3 9733 1726773066.30022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773066.30215: done with get_vars() 9733 1726773066.30225: done getting variables 9733 1726773066.30280: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:181 Thursday 19 September 2024 15:11:06 -0400 (0:00:00.025) 0:00:12.035 **** 9733 1726773066.30310: entering _queue_task() for managed_node3/set_fact 9733 1726773066.30539: worker is 1 (out of 1 available) 9733 1726773066.30553: exiting _queue_task() for managed_node3/set_fact 9733 1726773066.30565: done queuing things up, now waiting for results queue to drain 9733 1726773066.30567: waiting for pending results... 10195 1726773066.30794: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 10195 1726773066.30923: in run() - task 0affffe7-6841-7dd6-8fa6-00000000005e 10195 1726773066.30942: variable 'ansible_search_path' from source: unknown 10195 1726773066.30947: variable 'ansible_search_path' from source: unknown 10195 1726773066.30981: calling self._execute() 10195 1726773066.31063: variable 'ansible_host' from source: host vars for 'managed_node3' 10195 1726773066.31072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10195 1726773066.31081: variable 'omit' from source: magic vars 10195 1726773066.31180: variable 'omit' from source: magic vars 10195 1726773066.31228: variable 'omit' from source: magic vars 10195 1726773066.31591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10195 1726773066.31887: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10195 1726773066.31932: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10195 1726773066.31968: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10195 1726773066.32003: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10195 1726773066.32133: variable '__kernel_settings_register_profile' from source: set_fact 10195 1726773066.32148: variable '__kernel_settings_register_mode' from source: set_fact 10195 1726773066.32156: variable '__kernel_settings_register_apply' from source: set_fact 10195 1726773066.32205: variable 'omit' from source: magic vars 10195 
1726773066.32232: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10195 1726773066.32293: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10195 1726773066.32312: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10195 1726773066.32330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10195 1726773066.32341: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10195 1726773066.32370: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10195 1726773066.32376: variable 'ansible_host' from source: host vars for 'managed_node3' 10195 1726773066.32381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10195 1726773066.32482: Set connection var ansible_timeout to 10 10195 1726773066.32489: Set connection var ansible_shell_type to sh 10195 1726773066.32496: Set connection var ansible_module_compression to ZIP_DEFLATED 10195 1726773066.32501: Set connection var ansible_shell_executable to /bin/sh 10195 1726773066.32507: Set connection var ansible_pipelining to False 10195 1726773066.32514: Set connection var ansible_connection to ssh 10195 1726773066.32537: variable 'ansible_shell_executable' from source: unknown 10195 1726773066.32542: variable 'ansible_connection' from source: unknown 10195 1726773066.32545: variable 'ansible_module_compression' from source: unknown 10195 1726773066.32548: variable 'ansible_shell_type' from source: unknown 10195 1726773066.32551: variable 'ansible_shell_executable' from source: unknown 10195 1726773066.32554: variable 'ansible_host' from source: host vars for 'managed_node3' 10195 1726773066.32558: variable 'ansible_pipelining' from source: unknown 10195 1726773066.32561: variable 'ansible_timeout' from source: unknown 10195 1726773066.32565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10195 1726773066.32660: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10195 1726773066.32672: variable 'omit' from source: magic vars 10195 1726773066.32678: starting attempt loop 10195 1726773066.32682: running the handler 10195 1726773066.32694: handler run complete 10195 1726773066.32702: attempt loop complete, returning result 10195 1726773066.32705: _execute() done 10195 1726773066.32708: dumping result to json 10195 1726773066.32711: done dumping result, returning 10195 1726773066.32717: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing [0affffe7-6841-7dd6-8fa6-00000000005e] 10195 1726773066.32723: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000005e 10195 1726773066.32748: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000005e 10195 1726773066.32752: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_changed": true }, "changed": false } 9733 1726773066.33005: no more pending results, returning what we have 9733 1726773066.33008: results queue empty 9733 1726773066.33009: checking 
for any_errors_fatal 9733 1726773066.33016: done checking for any_errors_fatal 9733 1726773066.33017: checking for max_fail_percentage 9733 1726773066.33019: done checking for max_fail_percentage 9733 1726773066.33019: checking to see if all hosts have failed and the running result is not ok 9733 1726773066.33020: done checking to see if all hosts have failed 9733 1726773066.33021: getting the remaining hosts for this loop 9733 1726773066.33022: done getting the remaining hosts for this loop 9733 1726773066.33026: getting the next task for host managed_node3 9733 1726773066.33035: done getting next task for host managed_node3 9733 1726773066.33037: ^ task is: TASK: meta (role_complete) 9733 1726773066.33039: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773066.33051: getting variables 9733 1726773066.33053: in VariableManager get_vars() 9733 1726773066.33090: Calling all_inventory to load vars for managed_node3 9733 1726773066.33094: Calling groups_inventory to load vars for managed_node3 9733 1726773066.33096: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773066.33108: Calling all_plugins_play to load vars for managed_node3 9733 1726773066.33111: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773066.33114: Calling groups_plugins_play to load vars for managed_node3 9733 1726773066.33294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773066.33495: done with get_vars() 9733 1726773066.33507: done getting variables 9733 1726773066.33586: done queuing things up, now waiting for results queue to drain 9733 1726773066.33589: results queue empty 9733 1726773066.33589: checking for any_errors_fatal 9733 1726773066.33594: done checking for any_errors_fatal 9733 1726773066.33594: checking for max_fail_percentage 9733 1726773066.33595: done checking for max_fail_percentage 9733 1726773066.33601: checking to see if all hosts have failed and the running result is not ok 9733 1726773066.33602: done checking to see if all hosts have failed 9733 1726773066.33602: getting the remaining hosts for this loop 9733 1726773066.33603: done getting the remaining hosts for this loop 9733 1726773066.33605: getting the next task for host managed_node3 9733 1726773066.33609: done getting next task for host managed_node3 9733 1726773066.33610: ^ task is: TASK: Verify that settings were applied correctly 9733 1726773066.33612: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773066.33614: getting variables 9733 1726773066.33615: in VariableManager get_vars() 9733 1726773066.33626: Calling all_inventory to load vars for managed_node3 9733 1726773066.33628: Calling groups_inventory to load vars for managed_node3 9733 1726773066.33630: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773066.33634: Calling all_plugins_play to load vars for managed_node3 9733 1726773066.33636: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773066.33638: Calling groups_plugins_play to load vars for managed_node3 9733 1726773066.33804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773066.34013: done with get_vars() 9733 1726773066.34022: done getting variables TASK [Verify that settings were applied correctly] ***************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_simple_settings.yml:30 Thursday 19 September 2024 15:11:06 -0400 (0:00:00.037) 0:00:12.073 **** 9733 1726773066.34094: entering _queue_task() for managed_node3/include_tasks 9733 1726773066.34569: worker is 1 (out of 1 available) 9733 1726773066.34581: exiting _queue_task() for managed_node3/include_tasks 9733 1726773066.34594: done queuing things up, now waiting for results queue to drain 9733 1726773066.34596: waiting for pending results... 10196 1726773066.34814: running TaskExecutor() for managed_node3/TASK: Verify that settings were applied correctly 10196 1726773066.34927: in run() - task 0affffe7-6841-7dd6-8fa6-000000000009 10196 1726773066.34946: variable 'ansible_search_path' from source: unknown 10196 1726773066.34982: calling self._execute() 10196 1726773066.35065: variable 'ansible_host' from source: host vars for 'managed_node3' 10196 1726773066.35075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10196 1726773066.35084: variable 'omit' from source: magic vars 10196 1726773066.35182: _execute() done 10196 1726773066.35189: dumping result to json 10196 1726773066.35194: done dumping result, returning 10196 1726773066.35199: done running TaskExecutor() for managed_node3/TASK: Verify that settings were applied correctly [0affffe7-6841-7dd6-8fa6-000000000009] 10196 1726773066.35207: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000009 10196 1726773066.35236: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000009 10196 1726773066.35240: WORKER PROCESS EXITING 9733 1726773066.35568: no more pending results, returning what we have 9733 1726773066.35572: in VariableManager get_vars() 9733 1726773066.35607: Calling all_inventory to load vars for managed_node3 9733 1726773066.35610: Calling groups_inventory to load vars for managed_node3 9733 1726773066.35612: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773066.35622: Calling all_plugins_play to load vars for managed_node3 9733 1726773066.35624: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773066.35627: Calling groups_plugins_play to load vars for managed_node3 9733 1726773066.35797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773066.35991: done with get_vars() 9733 1726773066.35999: variable 'ansible_search_path' from source: unknown 9733 1726773066.36013: we have included files to process 9733 1726773066.36013: generating all_blocks data 9733 1726773066.36015: done generating 
all_blocks data 9733 1726773066.36018: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings.yml 9733 1726773066.36020: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings.yml 9733 1726773066.36022: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings.yml 9733 1726773066.36837: in VariableManager get_vars() 9733 1726773066.36857: done with get_vars() included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings.yml for managed_node3 9733 1726773066.37011: done processing included file 9733 1726773066.37013: iterating over new_blocks loaded from include file 9733 1726773066.37014: in VariableManager get_vars() 9733 1726773066.37028: done with get_vars() 9733 1726773066.37030: filtering new block on tags 9733 1726773066.37105: done filtering new block on tags 9733 1726773066.37108: done iterating over new_blocks loaded from include file 9733 1726773066.37109: extending task lists for all hosts with included blocks 9733 1726773066.38282: done extending task lists 9733 1726773066.38283: done processing included files 9733 1726773066.38284: results queue empty 9733 1726773066.38286: checking for any_errors_fatal 9733 1726773066.38288: done checking for any_errors_fatal 9733 1726773066.38288: checking for max_fail_percentage 9733 1726773066.38289: done checking for max_fail_percentage 9733 1726773066.38290: checking to see if all hosts have failed and the running result is not ok 9733 1726773066.38291: done checking to see if all hosts have failed 9733 1726773066.38291: getting the remaining hosts for this loop 9733 1726773066.38292: done getting the remaining hosts for this loop 9733 1726773066.38294: getting the next task for host managed_node3 9733 1726773066.38298: done getting next task for host managed_node3 9733 1726773066.38300: ^ task is: TASK: Set version specific variables 9733 1726773066.38302: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773066.38304: getting variables 9733 1726773066.38305: in VariableManager get_vars() 9733 1726773066.38318: Calling all_inventory to load vars for managed_node3 9733 1726773066.38320: Calling groups_inventory to load vars for managed_node3 9733 1726773066.38322: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773066.38328: Calling all_plugins_play to load vars for managed_node3 9733 1726773066.38330: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773066.38333: Calling groups_plugins_play to load vars for managed_node3 9733 1726773066.38494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773066.38676: done with get_vars() 9733 1726773066.38688: done getting variables 9733 1726773066.38726: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set version specific variables] ****************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings.yml:2 Thursday 19 September 2024 15:11:06 -0400 (0:00:00.046) 0:00:12.120 **** 9733 1726773066.38754: entering _queue_task() for managed_node3/include_vars 9733 1726773066.39031: worker is 1 (out of 1 available) 9733 1726773066.39046: exiting _queue_task() for managed_node3/include_vars 9733 1726773066.39058: done queuing things up, now waiting for results queue to drain 9733 1726773066.39059: waiting for pending results... 
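The include_vars task queued here, "Set version specific variables" (assert_kernel_settings.yml:2), resolves a first_found lookup over tests_CentOS_8.yml, tests_CentOS.yml and tests_default.yml and ends up loading vars/tests_default.yml, as the trace that follows shows. A sketch of that pattern; the exact structure of the ffparams task var is an assumption, and only the use of first_found and the file-name fallback order are visible in this log:

  - name: Set version specific variables
    include_vars: "{{ lookup('first_found', ffparams) }}"
    vars:
      ffparams:
        files:
          # fallback order reconstructed from the search_path entries below
          - "tests_{{ ansible_distribution }}_{{ ansible_distribution_major_version }}.yml"
          - "tests_{{ ansible_distribution }}.yml"
          - "tests_default.yml"
        paths:
          - vars
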
10197 1726773066.39288: running TaskExecutor() for managed_node3/TASK: Set version specific variables 10197 1726773066.39414: in run() - task 0affffe7-6841-7dd6-8fa6-000000000187 10197 1726773066.39434: variable 'ansible_search_path' from source: unknown 10197 1726773066.39439: variable 'ansible_search_path' from source: unknown 10197 1726773066.39471: calling self._execute() 10197 1726773066.39554: variable 'ansible_host' from source: host vars for 'managed_node3' 10197 1726773066.39563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10197 1726773066.39573: variable 'omit' from source: magic vars 10197 1726773066.39671: variable 'omit' from source: magic vars 10197 1726773066.39714: variable 'omit' from source: magic vars 10197 1726773066.40063: variable 'ffparams' from source: task vars 10197 1726773066.40167: variable 'ansible_distribution' from source: facts 10197 1726773066.40176: variable 'ansible_distribution_major_version' from source: facts 10197 1726773066.40329: variable 'ansible_distribution' from source: facts 10197 1726773066.40489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 10197 1726773066.40971: Loaded config def from plugin (lookup/first_found) 10197 1726773066.40981: Loading LookupModule 'first_found' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/first_found.py 10197 1726773066.41018: variable 'ansible_search_path' from source: unknown 10197 1726773066.41026: evaluation_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings 10197 1726773066.41068: search_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/files/vars/tests_CentOS_8.yml /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/vars/tests_CentOS_8.yml /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/files/vars/tests_CentOS_8.yml /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/vars/tests_CentOS_8.yml /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/files/vars/tests_CentOS_8.yml /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/vars/tests_CentOS_8.yml 10197 1726773066.41092: variable 'ansible_search_path' from source: unknown 10197 1726773066.41097: evaluation_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings 10197 1726773066.41129: search_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/files/vars/tests_CentOS.yml /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/vars/tests_CentOS.yml /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/files/vars/tests_CentOS.yml /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/vars/tests_CentOS.yml /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/files/vars/tests_CentOS.yml /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/vars/tests_CentOS.yml 10197 1726773066.41145: variable 'ansible_search_path' from source: unknown 10197 
1726773066.41149: evaluation_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings 10197 1726773066.41177: search_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/files/vars/tests_default.yml /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/vars/tests_default.yml /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/files/vars/tests_default.yml /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/vars/tests_default.yml /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/files/vars/tests_default.yml /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/vars/tests_default.yml 10197 1726773066.41199: variable 'omit' from source: magic vars 10197 1726773066.41223: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10197 1726773066.41246: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10197 1726773066.41264: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10197 1726773066.41280: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10197 1726773066.41303: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10197 1726773066.41330: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10197 1726773066.41335: variable 'ansible_host' from source: host vars for 'managed_node3' 10197 1726773066.41339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10197 1726773066.41439: Set connection var ansible_timeout to 10 10197 1726773066.41444: Set connection var ansible_shell_type to sh 10197 1726773066.41450: Set connection var ansible_module_compression to ZIP_DEFLATED 10197 1726773066.41456: Set connection var ansible_shell_executable to /bin/sh 10197 1726773066.41461: Set connection var ansible_pipelining to False 10197 1726773066.41468: Set connection var ansible_connection to ssh 10197 1726773066.41491: variable 'ansible_shell_executable' from source: unknown 10197 1726773066.41497: variable 'ansible_connection' from source: unknown 10197 1726773066.41500: variable 'ansible_module_compression' from source: unknown 10197 1726773066.41503: variable 'ansible_shell_type' from source: unknown 10197 1726773066.41506: variable 'ansible_shell_executable' from source: unknown 10197 1726773066.41509: variable 'ansible_host' from source: host vars for 'managed_node3' 10197 1726773066.41512: variable 'ansible_pipelining' from source: unknown 10197 1726773066.41515: variable 'ansible_timeout' from source: unknown 10197 1726773066.41519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10197 1726773066.41615: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10197 1726773066.41627: variable 'omit' from 
source: magic vars 10197 1726773066.41633: starting attempt loop 10197 1726773066.41637: running the handler 10197 1726773066.41690: handler run complete 10197 1726773066.41700: attempt loop complete, returning result 10197 1726773066.41704: _execute() done 10197 1726773066.41708: dumping result to json 10197 1726773066.41712: done dumping result, returning 10197 1726773066.41718: done running TaskExecutor() for managed_node3/TASK: Set version specific variables [0affffe7-6841-7dd6-8fa6-000000000187] 10197 1726773066.41723: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000187 10197 1726773066.41753: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000187 10197 1726773066.41757: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_test_python_cmd": "/usr/libexec/platform-python", "__kernel_settings_test_python_pkgs": [ "python3-configobj" ] }, "ansible_included_var_files": [ "/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/vars/tests_default.yml" ], "changed": false } 9733 1726773066.42063: no more pending results, returning what we have 9733 1726773066.42067: results queue empty 9733 1726773066.42067: checking for any_errors_fatal 9733 1726773066.42069: done checking for any_errors_fatal 9733 1726773066.42069: checking for max_fail_percentage 9733 1726773066.42070: done checking for max_fail_percentage 9733 1726773066.42071: checking to see if all hosts have failed and the running result is not ok 9733 1726773066.42072: done checking to see if all hosts have failed 9733 1726773066.42072: getting the remaining hosts for this loop 9733 1726773066.42073: done getting the remaining hosts for this loop 9733 1726773066.42076: getting the next task for host managed_node3 9733 1726773066.42082: done getting next task for host managed_node3 9733 1726773066.42086: ^ task is: TASK: Reset settings success flag 9733 1726773066.42089: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773066.42092: getting variables 9733 1726773066.42093: in VariableManager get_vars() 9733 1726773066.42125: Calling all_inventory to load vars for managed_node3 9733 1726773066.42127: Calling groups_inventory to load vars for managed_node3 9733 1726773066.42130: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773066.42139: Calling all_plugins_play to load vars for managed_node3 9733 1726773066.42142: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773066.42145: Calling groups_plugins_play to load vars for managed_node3 9733 1726773066.42305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773066.42503: done with get_vars() 9733 1726773066.42514: done getting variables 9733 1726773066.42569: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Reset settings success flag] ********************************************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings.yml:13 Thursday 19 September 2024 15:11:06 -0400 (0:00:00.038) 0:00:12.158 **** 9733 1726773066.42597: entering _queue_task() for managed_node3/set_fact 9733 1726773066.42830: worker is 1 (out of 1 available) 9733 1726773066.42844: exiting _queue_task() for managed_node3/set_fact 9733 1726773066.42856: done queuing things up, now waiting for results queue to drain 9733 1726773066.42858: waiting for pending results... 10198 1726773066.43073: running TaskExecutor() for managed_node3/TASK: Reset settings success flag 10198 1726773066.43196: in run() - task 0affffe7-6841-7dd6-8fa6-000000000188 10198 1726773066.43215: variable 'ansible_search_path' from source: unknown 10198 1726773066.43219: variable 'ansible_search_path' from source: unknown 10198 1726773066.43250: calling self._execute() 10198 1726773066.43329: variable 'ansible_host' from source: host vars for 'managed_node3' 10198 1726773066.43339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10198 1726773066.43347: variable 'omit' from source: magic vars 10198 1726773066.43443: variable 'omit' from source: magic vars 10198 1726773066.43483: variable 'omit' from source: magic vars 10198 1726773066.43516: variable 'omit' from source: magic vars 10198 1726773066.43559: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10198 1726773066.43595: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10198 1726773066.43619: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10198 1726773066.43975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10198 1726773066.43995: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10198 1726773066.44026: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10198 1726773066.44033: variable 'ansible_host' from source: host vars for 'managed_node3' 10198 1726773066.44037: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 10198 1726773066.44139: Set connection var ansible_timeout to 10 10198 1726773066.44145: Set connection var ansible_shell_type to sh 10198 1726773066.44151: Set connection var ansible_module_compression to ZIP_DEFLATED 10198 1726773066.44156: Set connection var ansible_shell_executable to /bin/sh 10198 1726773066.44161: Set connection var ansible_pipelining to False 10198 1726773066.44168: Set connection var ansible_connection to ssh 10198 1726773066.44190: variable 'ansible_shell_executable' from source: unknown 10198 1726773066.44195: variable 'ansible_connection' from source: unknown 10198 1726773066.44198: variable 'ansible_module_compression' from source: unknown 10198 1726773066.44201: variable 'ansible_shell_type' from source: unknown 10198 1726773066.44204: variable 'ansible_shell_executable' from source: unknown 10198 1726773066.44207: variable 'ansible_host' from source: host vars for 'managed_node3' 10198 1726773066.44211: variable 'ansible_pipelining' from source: unknown 10198 1726773066.44214: variable 'ansible_timeout' from source: unknown 10198 1726773066.44218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10198 1726773066.44346: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10198 1726773066.44360: variable 'omit' from source: magic vars 10198 1726773066.44366: starting attempt loop 10198 1726773066.44369: running the handler 10198 1726773066.44379: handler run complete 10198 1726773066.44390: attempt loop complete, returning result 10198 1726773066.44393: _execute() done 10198 1726773066.44396: dumping result to json 10198 1726773066.44398: done dumping result, returning 10198 1726773066.44403: done running TaskExecutor() for managed_node3/TASK: Reset settings success flag [0affffe7-6841-7dd6-8fa6-000000000188] 10198 1726773066.44408: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000188 10198 1726773066.44430: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000188 10198 1726773066.44433: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_success": true }, "changed": false } 9733 1726773066.44821: no more pending results, returning what we have 9733 1726773066.44824: results queue empty 9733 1726773066.44825: checking for any_errors_fatal 9733 1726773066.44830: done checking for any_errors_fatal 9733 1726773066.44831: checking for max_fail_percentage 9733 1726773066.44833: done checking for max_fail_percentage 9733 1726773066.44833: checking to see if all hosts have failed and the running result is not ok 9733 1726773066.44834: done checking to see if all hosts have failed 9733 1726773066.44834: getting the remaining hosts for this loop 9733 1726773066.44836: done getting the remaining hosts for this loop 9733 1726773066.44839: getting the next task for host managed_node3 9733 1726773066.44845: done getting next task for host managed_node3 9733 1726773066.44847: ^ task is: TASK: Check tuned-adm verify 9733 1726773066.44850: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773066.44853: getting variables 9733 1726773066.44855: in VariableManager get_vars() 9733 1726773066.44888: Calling all_inventory to load vars for managed_node3 9733 1726773066.44892: Calling groups_inventory to load vars for managed_node3 9733 1726773066.44894: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773066.44904: Calling all_plugins_play to load vars for managed_node3 9733 1726773066.44907: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773066.44910: Calling groups_plugins_play to load vars for managed_node3 9733 1726773066.45320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773066.45504: done with get_vars() 9733 1726773066.45514: done getting variables 9733 1726773066.45567: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check tuned-adm verify] ************************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings.yml:17 Thursday 19 September 2024 15:11:06 -0400 (0:00:00.029) 0:00:12.188 **** 9733 1726773066.45595: entering _queue_task() for managed_node3/command 9733 1726773066.45835: worker is 1 (out of 1 available) 9733 1726773066.45848: exiting _queue_task() for managed_node3/command 9733 1726773066.45859: done queuing things up, now waiting for results queue to drain 9733 1726773066.45861: waiting for pending results... 
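
For orientation while reading this stretch of the trace: the set_fact task just completed and the command task just queued both come from assert_kernel_settings.yml (lines 13 and 17 in the task paths above). Below is a minimal, hypothetical reconstruction of those two tasks, built only from the task names, the fact value in the "ok" result, the skip condition, and the register variable that later tasks evaluate; the actual command line is an assumption based on the task name.

# Hypothetical reconstruction, not the real test source.
- name: Reset settings success flag
  set_fact:
    __kernel_settings_success: true        # matches the "ok" result for task ...000188

- name: Check tuned-adm verify
  command: tuned-adm verify                # exact command is assumed from the task name
  register: __kernel_settings_register_verify
  when: __kernel_settings_test_verify      # reported below as the false condition

The three tasks that follow in the trace (Check for verify errors, Check /proc/cmdline, Set error flag based on verify) are each gated on "__kernel_settings_register_verify is failed", which is why all of them are skipped below with "Conditional result was False".
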
10199 1726773066.46070: running TaskExecutor() for managed_node3/TASK: Check tuned-adm verify 10199 1726773066.46193: in run() - task 0affffe7-6841-7dd6-8fa6-000000000189 10199 1726773066.46211: variable 'ansible_search_path' from source: unknown 10199 1726773066.46215: variable 'ansible_search_path' from source: unknown 10199 1726773066.46241: calling self._execute() 10199 1726773066.46304: variable 'ansible_host' from source: host vars for 'managed_node3' 10199 1726773066.46312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10199 1726773066.46317: variable 'omit' from source: magic vars 10199 1726773066.46670: variable '__kernel_settings_test_verify' from source: include params 10199 1726773066.46683: Evaluated conditional (__kernel_settings_test_verify): False 10199 1726773066.46690: when evaluation is False, skipping this task 10199 1726773066.46693: _execute() done 10199 1726773066.46696: dumping result to json 10199 1726773066.46699: done dumping result, returning 10199 1726773066.46705: done running TaskExecutor() for managed_node3/TASK: Check tuned-adm verify [0affffe7-6841-7dd6-8fa6-000000000189] 10199 1726773066.46711: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000189 10199 1726773066.46739: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000189 10199 1726773066.46743: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_test_verify", "skip_reason": "Conditional result was False" } 9733 1726773066.47096: no more pending results, returning what we have 9733 1726773066.47099: results queue empty 9733 1726773066.47100: checking for any_errors_fatal 9733 1726773066.47105: done checking for any_errors_fatal 9733 1726773066.47105: checking for max_fail_percentage 9733 1726773066.47107: done checking for max_fail_percentage 9733 1726773066.47108: checking to see if all hosts have failed and the running result is not ok 9733 1726773066.47108: done checking to see if all hosts have failed 9733 1726773066.47109: getting the remaining hosts for this loop 9733 1726773066.47110: done getting the remaining hosts for this loop 9733 1726773066.47113: getting the next task for host managed_node3 9733 1726773066.47119: done getting next task for host managed_node3 9733 1726773066.47121: ^ task is: TASK: Check for verify errors 9733 1726773066.47123: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773066.47127: getting variables 9733 1726773066.47128: in VariableManager get_vars() 9733 1726773066.47158: Calling all_inventory to load vars for managed_node3 9733 1726773066.47161: Calling groups_inventory to load vars for managed_node3 9733 1726773066.47163: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773066.47175: Calling all_plugins_play to load vars for managed_node3 9733 1726773066.47178: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773066.47181: Calling groups_plugins_play to load vars for managed_node3 9733 1726773066.47346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773066.47550: done with get_vars() 9733 1726773066.47560: done getting variables 9733 1726773066.47619: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check for verify errors] ************************************************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings.yml:24 Thursday 19 September 2024 15:11:06 -0400 (0:00:00.020) 0:00:12.208 **** 9733 1726773066.47644: entering _queue_task() for managed_node3/command 9733 1726773066.47855: worker is 1 (out of 1 available) 9733 1726773066.47869: exiting _queue_task() for managed_node3/command 9733 1726773066.47882: done queuing things up, now waiting for results queue to drain 9733 1726773066.47884: waiting for pending results... 10202 1726773066.48095: running TaskExecutor() for managed_node3/TASK: Check for verify errors 10202 1726773066.48221: in run() - task 0affffe7-6841-7dd6-8fa6-00000000018a 10202 1726773066.48241: variable 'ansible_search_path' from source: unknown 10202 1726773066.48245: variable 'ansible_search_path' from source: unknown 10202 1726773066.48279: calling self._execute() 10202 1726773066.48362: variable 'ansible_host' from source: host vars for 'managed_node3' 10202 1726773066.48376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10202 1726773066.48387: variable 'omit' from source: magic vars 10202 1726773066.48915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10202 1726773066.49167: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10202 1726773066.49215: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10202 1726773066.49248: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10202 1726773066.49287: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10202 1726773066.49362: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10202 1726773066.49392: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10202 1726773066.49417: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10202 1726773066.49442: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10202 1726773066.49552: variable '__kernel_settings_register_verify' from source: set_fact 10202 1726773066.49568: Evaluated conditional (__kernel_settings_register_verify is defined): True 10202 1726773066.49697: variable '__kernel_settings_register_verify' from source: set_fact 10202 1726773066.49708: Evaluated conditional (__kernel_settings_register_verify is failed): False 10202 1726773066.49713: when evaluation is False, skipping this task 10202 1726773066.49716: _execute() done 10202 1726773066.49719: dumping result to json 10202 1726773066.49722: done dumping result, returning 10202 1726773066.49728: done running TaskExecutor() for managed_node3/TASK: Check for verify errors [0affffe7-6841-7dd6-8fa6-00000000018a] 10202 1726773066.49733: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000018a 10202 1726773066.49761: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000018a 10202 1726773066.49764: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_register_verify is failed", "skip_reason": "Conditional result was False" } 9733 1726773066.50148: no more pending results, returning what we have 9733 1726773066.50151: results queue empty 9733 1726773066.50152: checking for any_errors_fatal 9733 1726773066.50156: done checking for any_errors_fatal 9733 1726773066.50157: checking for max_fail_percentage 9733 1726773066.50159: done checking for max_fail_percentage 9733 1726773066.50159: checking to see if all hosts have failed and the running result is not ok 9733 1726773066.50160: done checking to see if all hosts have failed 9733 1726773066.50161: getting the remaining hosts for this loop 9733 1726773066.50162: done getting the remaining hosts for this loop 9733 1726773066.50164: getting the next task for host managed_node3 9733 1726773066.50172: done getting next task for host managed_node3 9733 1726773066.50174: ^ task is: TASK: Check /proc/cmdline 9733 1726773066.50177: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773066.50180: getting variables 9733 1726773066.50181: in VariableManager get_vars() 9733 1726773066.50213: Calling all_inventory to load vars for managed_node3 9733 1726773066.50216: Calling groups_inventory to load vars for managed_node3 9733 1726773066.50219: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773066.50229: Calling all_plugins_play to load vars for managed_node3 9733 1726773066.50231: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773066.50234: Calling groups_plugins_play to load vars for managed_node3 9733 1726773066.50450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773066.50640: done with get_vars() 9733 1726773066.50652: done getting variables 9733 1726773066.50714: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check /proc/cmdline] ***************************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings.yml:31 Thursday 19 September 2024 15:11:06 -0400 (0:00:00.030) 0:00:12.239 **** 9733 1726773066.50745: entering _queue_task() for managed_node3/command 9733 1726773066.51013: worker is 1 (out of 1 available) 9733 1726773066.51027: exiting _queue_task() for managed_node3/command 9733 1726773066.51040: done queuing things up, now waiting for results queue to drain 9733 1726773066.51042: waiting for pending results... 10203 1726773066.51487: running TaskExecutor() for managed_node3/TASK: Check /proc/cmdline 10203 1726773066.51617: in run() - task 0affffe7-6841-7dd6-8fa6-00000000018b 10203 1726773066.51637: variable 'ansible_search_path' from source: unknown 10203 1726773066.51641: variable 'ansible_search_path' from source: unknown 10203 1726773066.51676: calling self._execute() 10203 1726773066.51760: variable 'ansible_host' from source: host vars for 'managed_node3' 10203 1726773066.51768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10203 1726773066.51779: variable 'omit' from source: magic vars 10203 1726773066.52209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10203 1726773066.52494: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10203 1726773066.52534: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10203 1726773066.52567: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10203 1726773066.52603: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10203 1726773066.52712: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10203 1726773066.52735: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10203 1726773066.52758: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10203 1726773066.52784: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10203 1726773066.52895: variable '__kernel_settings_register_verify' from source: set_fact 10203 1726773066.52911: Evaluated conditional (__kernel_settings_register_verify is defined): True 10203 1726773066.53036: variable '__kernel_settings_register_verify' from source: set_fact 10203 1726773066.53046: Evaluated conditional (__kernel_settings_register_verify is failed): False 10203 1726773066.53051: when evaluation is False, skipping this task 10203 1726773066.53053: _execute() done 10203 1726773066.53056: dumping result to json 10203 1726773066.53059: done dumping result, returning 10203 1726773066.53065: done running TaskExecutor() for managed_node3/TASK: Check /proc/cmdline [0affffe7-6841-7dd6-8fa6-00000000018b] 10203 1726773066.53072: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000018b 10203 1726773066.53104: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000018b 10203 1726773066.53107: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_register_verify is failed", "skip_reason": "Conditional result was False" } 9733 1726773066.53527: no more pending results, returning what we have 9733 1726773066.53530: results queue empty 9733 1726773066.53531: checking for any_errors_fatal 9733 1726773066.53536: done checking for any_errors_fatal 9733 1726773066.53537: checking for max_fail_percentage 9733 1726773066.53538: done checking for max_fail_percentage 9733 1726773066.53539: checking to see if all hosts have failed and the running result is not ok 9733 1726773066.53540: done checking to see if all hosts have failed 9733 1726773066.53541: getting the remaining hosts for this loop 9733 1726773066.53542: done getting the remaining hosts for this loop 9733 1726773066.53545: getting the next task for host managed_node3 9733 1726773066.53551: done getting next task for host managed_node3 9733 1726773066.53553: ^ task is: TASK: Set error flag based on verify 9733 1726773066.53556: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773066.53559: getting variables 9733 1726773066.53560: in VariableManager get_vars() 9733 1726773066.53596: Calling all_inventory to load vars for managed_node3 9733 1726773066.53599: Calling groups_inventory to load vars for managed_node3 9733 1726773066.53601: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773066.53611: Calling all_plugins_play to load vars for managed_node3 9733 1726773066.53613: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773066.53616: Calling groups_plugins_play to load vars for managed_node3 9733 1726773066.53790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773066.54003: done with get_vars() 9733 1726773066.54016: done getting variables 9733 1726773066.54076: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set error flag based on verify] ****************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings.yml:38 Thursday 19 September 2024 15:11:06 -0400 (0:00:00.033) 0:00:12.273 **** 9733 1726773066.54107: entering _queue_task() for managed_node3/set_fact 9733 1726773066.54354: worker is 1 (out of 1 available) 9733 1726773066.54366: exiting _queue_task() for managed_node3/set_fact 9733 1726773066.54380: done queuing things up, now waiting for results queue to drain 9733 1726773066.54383: waiting for pending results... 10204 1726773066.54604: running TaskExecutor() for managed_node3/TASK: Set error flag based on verify 10204 1726773066.54729: in run() - task 0affffe7-6841-7dd6-8fa6-00000000018c 10204 1726773066.54751: variable 'ansible_search_path' from source: unknown 10204 1726773066.54756: variable 'ansible_search_path' from source: unknown 10204 1726773066.54794: calling self._execute() 10204 1726773066.54879: variable 'ansible_host' from source: host vars for 'managed_node3' 10204 1726773066.54891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10204 1726773066.54900: variable 'omit' from source: magic vars 10204 1726773066.55411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10204 1726773066.55703: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10204 1726773066.55747: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10204 1726773066.55782: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10204 1726773066.55818: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10204 1726773066.55900: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10204 1726773066.55927: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10204 1726773066.55956: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10204 1726773066.55988: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10204 1726773066.56103: variable '__kernel_settings_register_verify' from source: set_fact 10204 1726773066.56118: Evaluated conditional (__kernel_settings_register_verify is defined): True 10204 1726773066.56245: variable '__kernel_settings_register_verify' from source: set_fact 10204 1726773066.56256: Evaluated conditional (__kernel_settings_register_verify is failed): False 10204 1726773066.56261: when evaluation is False, skipping this task 10204 1726773066.56264: _execute() done 10204 1726773066.56267: dumping result to json 10204 1726773066.56272: done dumping result, returning 10204 1726773066.56279: done running TaskExecutor() for managed_node3/TASK: Set error flag based on verify [0affffe7-6841-7dd6-8fa6-00000000018c] 10204 1726773066.56284: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000018c 10204 1726773066.56317: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000018c 10204 1726773066.56320: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_register_verify is failed", "skip_reason": "Conditional result was False" } 9733 1726773066.56716: no more pending results, returning what we have 9733 1726773066.56719: results queue empty 9733 1726773066.56720: checking for any_errors_fatal 9733 1726773066.56724: done checking for any_errors_fatal 9733 1726773066.56725: checking for max_fail_percentage 9733 1726773066.56726: done checking for max_fail_percentage 9733 1726773066.56727: checking to see if all hosts have failed and the running result is not ok 9733 1726773066.56728: done checking to see if all hosts have failed 9733 1726773066.56728: getting the remaining hosts for this loop 9733 1726773066.56730: done getting the remaining hosts for this loop 9733 1726773066.56733: getting the next task for host managed_node3 9733 1726773066.56739: done getting next task for host managed_node3 9733 1726773066.56742: ^ task is: TASK: Check config files 9733 1726773066.56745: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773066.56748: getting variables 9733 1726773066.56750: in VariableManager get_vars() 9733 1726773066.56784: Calling all_inventory to load vars for managed_node3 9733 1726773066.56789: Calling groups_inventory to load vars for managed_node3 9733 1726773066.56790: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773066.56801: Calling all_plugins_play to load vars for managed_node3 9733 1726773066.56803: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773066.56806: Calling groups_plugins_play to load vars for managed_node3 9733 1726773066.57036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773066.57216: done with get_vars() 9733 1726773066.57227: done getting variables TASK [Check config files] ****************************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings.yml:45 Thursday 19 September 2024 15:11:06 -0400 (0:00:00.032) 0:00:12.305 **** 9733 1726773066.57318: entering _queue_task() for managed_node3/include_tasks 9733 1726773066.57559: worker is 1 (out of 1 available) 9733 1726773066.57575: exiting _queue_task() for managed_node3/include_tasks 9733 1726773066.57589: done queuing things up, now waiting for results queue to drain 9733 1726773066.57591: waiting for pending results... 10205 1726773066.57806: running TaskExecutor() for managed_node3/TASK: Check config files 10205 1726773066.57929: in run() - task 0affffe7-6841-7dd6-8fa6-00000000018d 10205 1726773066.57949: variable 'ansible_search_path' from source: unknown 10205 1726773066.57954: variable 'ansible_search_path' from source: unknown 10205 1726773066.57992: calling self._execute() 10205 1726773066.58081: variable 'ansible_host' from source: host vars for 'managed_node3' 10205 1726773066.58093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10205 1726773066.58102: variable 'omit' from source: magic vars 10205 1726773066.58539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10205 1726773066.58835: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10205 1726773066.58880: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10205 1726773066.58915: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10205 1726773066.58946: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10205 1726773066.59054: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10205 1726773066.59082: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10205 1726773066.59109: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10205 1726773066.59133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 10205 1726773066.59244: variable '__kernel_settings_profile_file' from source: include_vars 10205 1726773066.59391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10205 1726773066.61375: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10205 1726773066.61436: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10205 1726773066.61463: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10205 1726773066.61494: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10205 1726773066.61518: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10205 1726773066.61557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10205 1726773066.61576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10205 1726773066.61593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10205 1726773066.61616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10205 1726773066.61624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10205 1726773066.61705: Evaluated conditional (__kernel_settings_profile_file is defined): True 10205 1726773066.61714: _execute() done 10205 1726773066.61718: dumping result to json 10205 1726773066.61722: done dumping result, returning 10205 1726773066.61728: done running TaskExecutor() for managed_node3/TASK: Check config files [0affffe7-6841-7dd6-8fa6-00000000018d] 10205 1726773066.61733: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000018d 10205 1726773066.61758: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000018d 10205 1726773066.61761: WORKER PROCESS EXITING 9733 1726773066.61879: no more pending results, returning what we have 9733 1726773066.61884: in VariableManager get_vars() 9733 1726773066.61919: Calling all_inventory to load vars for managed_node3 9733 1726773066.61922: Calling groups_inventory to load vars for managed_node3 9733 1726773066.61924: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773066.61934: Calling all_plugins_play to load vars for managed_node3 9733 1726773066.61936: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773066.61938: Calling groups_plugins_play to load vars for managed_node3 9733 1726773066.62107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773066.62300: done with get_vars() 9733 1726773066.62308: variable 'ansible_search_path' from source: unknown 9733 1726773066.62309: variable 'ansible_search_path' from 
source: unknown 9733 1726773066.62348: we have included files to process 9733 1726773066.62349: generating all_blocks data 9733 1726773066.62351: done generating all_blocks data 9733 1726773066.62358: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings_conf_files.yml 9733 1726773066.62359: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings_conf_files.yml 9733 1726773066.62361: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings_conf_files.yml included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings_conf_files.yml for managed_node3 9733 1726773066.63729: done processing included file 9733 1726773066.63732: iterating over new_blocks loaded from include file 9733 1726773066.63733: in VariableManager get_vars() 9733 1726773066.63752: done with get_vars() 9733 1726773066.63754: filtering new block on tags 9733 1726773066.63881: done filtering new block on tags 9733 1726773066.63884: done iterating over new_blocks loaded from include file 9733 1726773066.63889: extending task lists for all hosts with included blocks 9733 1726773066.64255: done extending task lists 9733 1726773066.64257: done processing included files 9733 1726773066.64258: results queue empty 9733 1726773066.64258: checking for any_errors_fatal 9733 1726773066.64261: done checking for any_errors_fatal 9733 1726773066.64262: checking for max_fail_percentage 9733 1726773066.64263: done checking for max_fail_percentage 9733 1726773066.64264: checking to see if all hosts have failed and the running result is not ok 9733 1726773066.64264: done checking to see if all hosts have failed 9733 1726773066.64265: getting the remaining hosts for this loop 9733 1726773066.64266: done getting the remaining hosts for this loop 9733 1726773066.64268: getting the next task for host managed_node3 9733 1726773066.64272: done getting next task for host managed_node3 9733 1726773066.64279: ^ task is: TASK: Create temporary file for expected config 9733 1726773066.64281: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773066.64284: getting variables 9733 1726773066.64284: in VariableManager get_vars() 9733 1726773066.64298: Calling all_inventory to load vars for managed_node3 9733 1726773066.64300: Calling groups_inventory to load vars for managed_node3 9733 1726773066.64302: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773066.64308: Calling all_plugins_play to load vars for managed_node3 9733 1726773066.64310: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773066.64313: Calling groups_plugins_play to load vars for managed_node3 9733 1726773066.64454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773066.64644: done with get_vars() 9733 1726773066.64653: done getting variables TASK [Create temporary file for expected config] ******************************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings_conf_files.yml:2 Thursday 19 September 2024 15:11:06 -0400 (0:00:00.074) 0:00:12.379 **** 9733 1726773066.64719: entering _queue_task() for managed_node3/tempfile 9733 1726773066.64721: Creating lock for tempfile 9733 1726773066.64988: worker is 1 (out of 1 available) 9733 1726773066.65006: exiting _queue_task() for managed_node3/tempfile 9733 1726773066.65020: done queuing things up, now waiting for results queue to drain 9733 1726773066.65023: waiting for pending results... 10228 1726773066.65209: running TaskExecutor() for managed_node3/TASK: Create temporary file for expected config 10228 1726773066.65328: in run() - task 0affffe7-6841-7dd6-8fa6-000000000227 10228 1726773066.65346: variable 'ansible_search_path' from source: unknown 10228 1726773066.65349: variable 'ansible_search_path' from source: unknown 10228 1726773066.65380: calling self._execute() 10228 1726773066.65450: variable 'ansible_host' from source: host vars for 'managed_node3' 10228 1726773066.65459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10228 1726773066.65467: variable 'omit' from source: magic vars 10228 1726773066.65577: variable 'omit' from source: magic vars 10228 1726773066.65622: variable 'omit' from source: magic vars 10228 1726773066.65649: variable 'omit' from source: magic vars 10228 1726773066.65682: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10228 1726773066.65711: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10228 1726773066.65729: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10228 1726773066.65741: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10228 1726773066.65751: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10228 1726773066.65774: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10228 1726773066.65779: variable 'ansible_host' from source: host vars for 'managed_node3' 10228 1726773066.65782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10228 1726773066.65850: Set connection var ansible_timeout to 10 10228 1726773066.65854: Set connection var ansible_shell_type to sh 10228 1726773066.65858: Set connection var ansible_module_compression to ZIP_DEFLATED 10228 1726773066.65861: Set 
connection var ansible_shell_executable to /bin/sh 10228 1726773066.65864: Set connection var ansible_pipelining to False 10228 1726773066.65868: Set connection var ansible_connection to ssh 10228 1726773066.65883: variable 'ansible_shell_executable' from source: unknown 10228 1726773066.65890: variable 'ansible_connection' from source: unknown 10228 1726773066.65894: variable 'ansible_module_compression' from source: unknown 10228 1726773066.65895: variable 'ansible_shell_type' from source: unknown 10228 1726773066.65897: variable 'ansible_shell_executable' from source: unknown 10228 1726773066.65899: variable 'ansible_host' from source: host vars for 'managed_node3' 10228 1726773066.65901: variable 'ansible_pipelining' from source: unknown 10228 1726773066.65902: variable 'ansible_timeout' from source: unknown 10228 1726773066.65905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10228 1726773066.66057: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10228 1726773066.66070: variable 'omit' from source: magic vars 10228 1726773066.66077: starting attempt loop 10228 1726773066.66081: running the handler 10228 1726773066.66094: _low_level_execute_command(): starting 10228 1726773066.66103: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10228 1726773066.68490: stdout chunk (state=2): >>>/root <<< 10228 1726773066.68608: stderr chunk (state=3): >>><<< 10228 1726773066.68615: stdout chunk (state=3): >>><<< 10228 1726773066.68635: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10228 1726773066.68648: _low_level_execute_command(): starting 10228 1726773066.68655: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773066.6864326-10228-84151641337223 `" && echo ansible-tmp-1726773066.6864326-10228-84151641337223="` echo /root/.ansible/tmp/ansible-tmp-1726773066.6864326-10228-84151641337223 `" ) && sleep 0' 10228 1726773066.71126: stdout chunk (state=2): >>>ansible-tmp-1726773066.6864326-10228-84151641337223=/root/.ansible/tmp/ansible-tmp-1726773066.6864326-10228-84151641337223 <<< 10228 1726773066.71260: stderr chunk (state=3): >>><<< 10228 1726773066.71268: stdout chunk (state=3): >>><<< 10228 1726773066.71290: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773066.6864326-10228-84151641337223=/root/.ansible/tmp/ansible-tmp-1726773066.6864326-10228-84151641337223 , stderr= 10228 1726773066.71328: variable 'ansible_module_compression' from source: unknown 10228 1726773066.71367: ANSIBALLZ: Using lock for tempfile 10228 1726773066.71374: ANSIBALLZ: Acquiring lock 10228 1726773066.71377: ANSIBALLZ: Lock acquired: 139792131224832 10228 1726773066.71381: ANSIBALLZ: Creating module 10228 1726773066.80341: ANSIBALLZ: Writing module into payload 10228 1726773066.80395: ANSIBALLZ: Writing module 10228 1726773066.80414: ANSIBALLZ: Renaming module 10228 1726773066.80422: ANSIBALLZ: Done creating module 10228 1726773066.80437: variable 'ansible_facts' from source: unknown 10228 1726773066.80492: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773066.6864326-10228-84151641337223/AnsiballZ_tempfile.py 10228 1726773066.80592: Sending initial data 10228 1726773066.80599: Sent initial data (155 
bytes) 10228 1726773066.83257: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpu701b9ls /root/.ansible/tmp/ansible-tmp-1726773066.6864326-10228-84151641337223/AnsiballZ_tempfile.py <<< 10228 1726773066.84432: stderr chunk (state=3): >>><<< 10228 1726773066.84442: stdout chunk (state=3): >>><<< 10228 1726773066.84461: done transferring module to remote 10228 1726773066.84474: _low_level_execute_command(): starting 10228 1726773066.84480: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773066.6864326-10228-84151641337223/ /root/.ansible/tmp/ansible-tmp-1726773066.6864326-10228-84151641337223/AnsiballZ_tempfile.py && sleep 0' 10228 1726773066.86931: stderr chunk (state=2): >>><<< 10228 1726773066.86940: stdout chunk (state=2): >>><<< 10228 1726773066.86956: _low_level_execute_command() done: rc=0, stdout=, stderr= 10228 1726773066.86960: _low_level_execute_command(): starting 10228 1726773066.86966: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773066.6864326-10228-84151641337223/AnsiballZ_tempfile.py && sleep 0' 10228 1726773067.01811: stdout chunk (state=2): >>> {"changed": true, "path": "/tmp/ansible.ur0ymozm.kernel_settings", "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "invocation": {"module_args": {"state": "file", "suffix": ".kernel_settings", "prefix": "ansible.", "path": null}}} <<< 10228 1726773067.02855: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10228 1726773067.02906: stderr chunk (state=3): >>><<< 10228 1726773067.02913: stdout chunk (state=3): >>><<< 10228 1726773067.02929: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "path": "/tmp/ansible.ur0ymozm.kernel_settings", "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "invocation": {"module_args": {"state": "file", "suffix": ".kernel_settings", "prefix": "ansible.", "path": null}}} , stderr=Shared connection to 10.31.47.99 closed. 
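
The invocation block in the module stdout above echoes the tempfile arguments exactly (state file, prefix "ansible.", suffix ".kernel_settings"), so the task at assert_kernel_settings_conf_files.yml:2 can be sketched with little guesswork. Only the register name is inferred, from the "__kernel_settings_register_profile_conf_tempfile" variable that the next task resolves from set_fact; treat the snippet as a reconstruction rather than the actual file.

- name: Create temporary file for expected config
  tempfile:
    state: file
    prefix: ansible.
    suffix: .kernel_settings
  register: __kernel_settings_register_profile_conf_tempfile

The "changed" result reported just below returns the generated path (/tmp/ansible.ur0ymozm.kernel_settings), which is what the subsequent copy task probes and then fills with the expected configuration.
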
10228 1726773067.02966: done with _execute_module (tempfile, {'state': 'file', 'suffix': '.kernel_settings', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'tempfile', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773066.6864326-10228-84151641337223/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10228 1726773067.02980: _low_level_execute_command(): starting 10228 1726773067.02988: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773066.6864326-10228-84151641337223/ > /dev/null 2>&1 && sleep 0' 10228 1726773067.05436: stderr chunk (state=2): >>><<< 10228 1726773067.05445: stdout chunk (state=2): >>><<< 10228 1726773067.05459: _low_level_execute_command() done: rc=0, stdout=, stderr= 10228 1726773067.05466: handler run complete 10228 1726773067.05483: attempt loop complete, returning result 10228 1726773067.05489: _execute() done 10228 1726773067.05493: dumping result to json 10228 1726773067.05498: done dumping result, returning 10228 1726773067.05505: done running TaskExecutor() for managed_node3/TASK: Create temporary file for expected config [0affffe7-6841-7dd6-8fa6-000000000227] 10228 1726773067.05511: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000227 10228 1726773067.05544: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000227 10228 1726773067.05548: WORKER PROCESS EXITING changed: [managed_node3] => { "changed": true, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/ansible.ur0ymozm.kernel_settings", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } 9733 1726773067.05691: no more pending results, returning what we have 9733 1726773067.05693: results queue empty 9733 1726773067.05694: checking for any_errors_fatal 9733 1726773067.05696: done checking for any_errors_fatal 9733 1726773067.05696: checking for max_fail_percentage 9733 1726773067.05697: done checking for max_fail_percentage 9733 1726773067.05698: checking to see if all hosts have failed and the running result is not ok 9733 1726773067.05699: done checking to see if all hosts have failed 9733 1726773067.05699: getting the remaining hosts for this loop 9733 1726773067.05700: done getting the remaining hosts for this loop 9733 1726773067.05703: getting the next task for host managed_node3 9733 1726773067.05708: done getting next task for host managed_node3 9733 1726773067.05710: ^ task is: TASK: Put expected contents into temporary file 9733 1726773067.05713: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773067.05716: getting variables 9733 1726773067.05717: in VariableManager get_vars() 9733 1726773067.05788: Calling all_inventory to load vars for managed_node3 9733 1726773067.05791: Calling groups_inventory to load vars for managed_node3 9733 1726773067.05794: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773067.05804: Calling all_plugins_play to load vars for managed_node3 9733 1726773067.05806: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773067.05809: Calling groups_plugins_play to load vars for managed_node3 9733 1726773067.05932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773067.06052: done with get_vars() 9733 1726773067.06062: done getting variables 9733 1726773067.06117: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Put expected contents into temporary file] ******************************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings_conf_files.yml:8 Thursday 19 September 2024 15:11:07 -0400 (0:00:00.414) 0:00:12.793 **** 9733 1726773067.06138: entering _queue_task() for managed_node3/copy 9733 1726773067.06313: worker is 1 (out of 1 available) 9733 1726773067.06333: exiting _queue_task() for managed_node3/copy 9733 1726773067.06345: done queuing things up, now waiting for results queue to drain 9733 1726773067.06347: waiting for pending results... 
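
Worker 10241 resolves two variables before building the next action: the tempfile result registered above and __kernel_settings_profile_file from the included vars file. A plausible shape for the "Put expected contents into temporary file" task follows; the destination is grounded in the trace (the stat probe below targets the tempfile path), but the content expression and the __kernel_settings_expected_config name are placeholders, since the log never shows what text is written.

- name: Put expected contents into temporary file
  copy:
    dest: "{{ __kernel_settings_register_profile_conf_tempfile.path }}"
    # Placeholder: the real task presumably renders the expected tuned profile
    # text here; the variable name below is hypothetical.
    content: "{{ __kernel_settings_expected_config | d('') }}"
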
10241 1726773067.06472: running TaskExecutor() for managed_node3/TASK: Put expected contents into temporary file 10241 1726773067.06579: in run() - task 0affffe7-6841-7dd6-8fa6-000000000228 10241 1726773067.06595: variable 'ansible_search_path' from source: unknown 10241 1726773067.06599: variable 'ansible_search_path' from source: unknown 10241 1726773067.06628: calling self._execute() 10241 1726773067.06696: variable 'ansible_host' from source: host vars for 'managed_node3' 10241 1726773067.06706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10241 1726773067.06715: variable 'omit' from source: magic vars 10241 1726773067.06790: variable 'omit' from source: magic vars 10241 1726773067.06827: variable 'omit' from source: magic vars 10241 1726773067.07064: variable '__kernel_settings_register_profile_conf_tempfile' from source: set_fact 10241 1726773067.07093: variable '__kernel_settings_profile_file' from source: include_vars 10241 1726773067.07148: variable '__kernel_settings_profile_file' from source: include_vars 10241 1726773067.07287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10241 1726773067.08823: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10241 1726773067.08874: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10241 1726773067.08904: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10241 1726773067.08930: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10241 1726773067.08951: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10241 1726773067.09019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10241 1726773067.09041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10241 1726773067.09063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10241 1726773067.09095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10241 1726773067.09107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10241 1726773067.09196: variable 'omit' from source: magic vars 10241 1726773067.09218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10241 1726773067.09237: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10241 1726773067.09253: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10241 1726773067.09266: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 10241 1726773067.09278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10241 1726773067.09303: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10241 1726773067.09309: variable 'ansible_host' from source: host vars for 'managed_node3' 10241 1726773067.09314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10241 1726773067.09380: Set connection var ansible_timeout to 10 10241 1726773067.09387: Set connection var ansible_shell_type to sh 10241 1726773067.09392: Set connection var ansible_module_compression to ZIP_DEFLATED 10241 1726773067.09396: Set connection var ansible_shell_executable to /bin/sh 10241 1726773067.09402: Set connection var ansible_pipelining to False 10241 1726773067.09409: Set connection var ansible_connection to ssh 10241 1726773067.09426: variable 'ansible_shell_executable' from source: unknown 10241 1726773067.09430: variable 'ansible_connection' from source: unknown 10241 1726773067.09433: variable 'ansible_module_compression' from source: unknown 10241 1726773067.09436: variable 'ansible_shell_type' from source: unknown 10241 1726773067.09440: variable 'ansible_shell_executable' from source: unknown 10241 1726773067.09443: variable 'ansible_host' from source: host vars for 'managed_node3' 10241 1726773067.09448: variable 'ansible_pipelining' from source: unknown 10241 1726773067.09451: variable 'ansible_timeout' from source: unknown 10241 1726773067.09455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10241 1726773067.09520: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10241 1726773067.09530: variable 'omit' from source: magic vars 10241 1726773067.09535: starting attempt loop 10241 1726773067.09537: running the handler 10241 1726773067.09546: _low_level_execute_command(): starting 10241 1726773067.09552: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10241 1726773067.11887: stdout chunk (state=2): >>>/root <<< 10241 1726773067.12007: stderr chunk (state=3): >>><<< 10241 1726773067.12014: stdout chunk (state=3): >>><<< 10241 1726773067.12031: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10241 1726773067.12043: _low_level_execute_command(): starting 10241 1726773067.12049: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028 `" && echo ansible-tmp-1726773067.1203895-10241-212922774866028="` echo /root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028 `" ) && sleep 0' 10241 1726773067.14697: stdout chunk (state=2): >>>ansible-tmp-1726773067.1203895-10241-212922774866028=/root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028 <<< 10241 1726773067.15035: stderr chunk (state=3): >>><<< 10241 1726773067.15046: stdout chunk (state=3): >>><<< 10241 1726773067.15073: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773067.1203895-10241-212922774866028=/root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028 , stderr= 10241 1726773067.15175: variable 'ansible_module_compression' from source: 
unknown 10241 1726773067.15240: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10241 1726773067.15281: variable 'ansible_facts' from source: unknown 10241 1726773067.15374: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028/AnsiballZ_stat.py 10241 1726773067.15699: Sending initial data 10241 1726773067.15705: Sent initial data (152 bytes) 10241 1726773067.18151: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpmighnydo /root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028/AnsiballZ_stat.py <<< 10241 1726773067.19525: stderr chunk (state=3): >>><<< 10241 1726773067.19538: stdout chunk (state=3): >>><<< 10241 1726773067.19562: done transferring module to remote 10241 1726773067.19573: _low_level_execute_command(): starting 10241 1726773067.19578: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028/ /root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028/AnsiballZ_stat.py && sleep 0' 10241 1726773067.22021: stderr chunk (state=2): >>><<< 10241 1726773067.22032: stdout chunk (state=2): >>><<< 10241 1726773067.22048: _low_level_execute_command() done: rc=0, stdout=, stderr= 10241 1726773067.22054: _low_level_execute_command(): starting 10241 1726773067.22059: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028/AnsiballZ_stat.py && sleep 0' 10241 1726773067.38107: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/tmp/ansible.ur0ymozm.kernel_settings", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 736921, "dev": 51713, "nlink": 1, "atime": 1726773067.0151148, "mtime": 1726773067.0151148, "ctime": 1726773067.0151148, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "mimetype": "inode/x-empty", "charset": "binary", "version": "4171428263", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/tmp/ansible.ur0ymozm.kernel_settings", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 10241 1726773067.38919: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. 
<<< 10241 1726773067.38931: stdout chunk (state=3): >>><<< 10241 1726773067.38941: stderr chunk (state=3): >>><<< 10241 1726773067.38955: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/tmp/ansible.ur0ymozm.kernel_settings", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 736921, "dev": 51713, "nlink": 1, "atime": 1726773067.0151148, "mtime": 1726773067.0151148, "ctime": 1726773067.0151148, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "mimetype": "inode/x-empty", "charset": "binary", "version": "4171428263", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/tmp/ansible.ur0ymozm.kernel_settings", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.47.99 closed. 10241 1726773067.39016: done with _execute_module (ansible.legacy.stat, {'path': '/tmp/ansible.ur0ymozm.kernel_settings', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10241 1726773067.39483: Sending initial data 10241 1726773067.39492: Sent initial data (141 bytes) 10241 1726773067.42031: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpw2vv9va_ /root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028/source <<< 10241 1726773067.42560: stderr chunk (state=3): >>><<< 10241 1726773067.42573: stdout chunk (state=3): >>><<< 10241 1726773067.42599: _low_level_execute_command(): starting 10241 1726773067.42607: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028/ /root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028/source && sleep 0' 10241 1726773067.45217: stderr chunk (state=2): >>><<< 10241 1726773067.45229: stdout chunk (state=2): >>><<< 10241 1726773067.45247: _low_level_execute_command() done: rc=0, stdout=, stderr= 10241 1726773067.45275: variable 'ansible_module_compression' from source: unknown 10241 1726773067.45326: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 10241 1726773067.45346: variable 'ansible_facts' from source: unknown 10241 1726773067.45427: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028/AnsiballZ_copy.py 10241 1726773067.45895: Sending initial data 10241 1726773067.45902: Sent initial data (152 
bytes) 10241 1726773067.48371: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpgbmbglql /root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028/AnsiballZ_copy.py <<< 10241 1726773067.49974: stderr chunk (state=3): >>><<< 10241 1726773067.49986: stdout chunk (state=3): >>><<< 10241 1726773067.50004: done transferring module to remote 10241 1726773067.50013: _low_level_execute_command(): starting 10241 1726773067.50019: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028/ /root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028/AnsiballZ_copy.py && sleep 0' 10241 1726773067.52348: stderr chunk (state=2): >>><<< 10241 1726773067.52358: stdout chunk (state=2): >>><<< 10241 1726773067.52375: _low_level_execute_command() done: rc=0, stdout=, stderr= 10241 1726773067.52380: _low_level_execute_command(): starting 10241 1726773067.52386: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028/AnsiballZ_copy.py && sleep 0' 10241 1726773067.68586: stdout chunk (state=2): >>> {"dest": "/tmp/ansible.ur0ymozm.kernel_settings", "src": "/root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028/source", "md5sum": "b82bbe3a22d81e7a442595a59b2d0944", "checksum": "169e46a0442e2493edda7ef1d37dfc572d06525c", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 299, "invocation": {"module_args": {"dest": "/tmp/ansible.ur0ymozm.kernel_settings", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028/source", "_original_basename": "tmpw2vv9va_", "follow": false, "checksum": "169e46a0442e2493edda7ef1d37dfc572d06525c", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10241 1726773067.69713: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10241 1726773067.69762: stderr chunk (state=3): >>><<< 10241 1726773067.69769: stdout chunk (state=3): >>><<< 10241 1726773067.69790: _low_level_execute_command() done: rc=0, stdout= {"dest": "/tmp/ansible.ur0ymozm.kernel_settings", "src": "/root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028/source", "md5sum": "b82bbe3a22d81e7a442595a59b2d0944", "checksum": "169e46a0442e2493edda7ef1d37dfc572d06525c", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 299, "invocation": {"module_args": {"dest": "/tmp/ansible.ur0ymozm.kernel_settings", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028/source", "_original_basename": "tmpw2vv9va_", "follow": false, "checksum": "169e46a0442e2493edda7ef1d37dfc572d06525c", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.47.99 closed. 
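A side note on the checksums in the stat and copy results above: both are SHA-1 digests of file contents. The stat result reports da39a3ee5e6b4b0d3255bfef95601890afd80709 for the just-created temp file, which is simply the SHA-1 of zero bytes, and the copy result reports the digest of the 299-byte expected contents. A minimal sketch that reproduces such values locally (the file name at the end is hypothetical):

import hashlib

# SHA-1 of empty input -- matches the checksum the stat module reported for
# the still-empty /tmp/ansible.ur0ymozm.kernel_settings file above.
assert hashlib.sha1(b"").hexdigest() == "da39a3ee5e6b4b0d3255bfef95601890afd80709"

def file_sha1(path):
    """Return the SHA-1 digest of a file's contents, as Ansible reports it."""
    digest = hashlib.sha1()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

# "expected.conf" is a hypothetical local copy of the expected file body; with
# the same 299 bytes it would print 169e46a0442e2493edda7ef1d37dfc572d06525c.
# print(file_sha1("expected.conf"))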
10241 1726773067.69817: done with _execute_module (ansible.legacy.copy, {'dest': '/tmp/ansible.ur0ymozm.kernel_settings', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028/source', '_original_basename': 'tmpw2vv9va_', 'follow': False, 'checksum': '169e46a0442e2493edda7ef1d37dfc572d06525c', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10241 1726773067.69828: _low_level_execute_command(): starting 10241 1726773067.69834: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028/ > /dev/null 2>&1 && sleep 0' 10241 1726773067.72272: stderr chunk (state=2): >>><<< 10241 1726773067.72281: stdout chunk (state=2): >>><<< 10241 1726773067.72296: _low_level_execute_command() done: rc=0, stdout=, stderr= 10241 1726773067.72304: handler run complete 10241 1726773067.72324: attempt loop complete, returning result 10241 1726773067.72328: _execute() done 10241 1726773067.72332: dumping result to json 10241 1726773067.72337: done dumping result, returning 10241 1726773067.72345: done running TaskExecutor() for managed_node3/TASK: Put expected contents into temporary file [0affffe7-6841-7dd6-8fa6-000000000228] 10241 1726773067.72350: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000228 10241 1726773067.72383: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000228 10241 1726773067.72389: WORKER PROCESS EXITING changed: [managed_node3] => { "changed": true, "checksum": "169e46a0442e2493edda7ef1d37dfc572d06525c", "dest": "/tmp/ansible.ur0ymozm.kernel_settings", "gid": 0, "group": "root", "md5sum": "b82bbe3a22d81e7a442595a59b2d0944", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 299, "src": "/root/.ansible/tmp/ansible-tmp-1726773067.1203895-10241-212922774866028/source", "state": "file", "uid": 0 } 9733 1726773067.72547: no more pending results, returning what we have 9733 1726773067.72549: results queue empty 9733 1726773067.72550: checking for any_errors_fatal 9733 1726773067.72557: done checking for any_errors_fatal 9733 1726773067.72558: checking for max_fail_percentage 9733 1726773067.72559: done checking for max_fail_percentage 9733 1726773067.72560: checking to see if all hosts have failed and the running result is not ok 9733 1726773067.72560: done checking to see if all hosts have failed 9733 1726773067.72561: getting the remaining hosts for this loop 9733 1726773067.72562: done getting the remaining hosts for this loop 9733 1726773067.72565: getting the next task for host managed_node3 9733 1726773067.72571: done getting next task for host managed_node3 9733 1726773067.72573: ^ task is: TASK: Ensure python command exists for tests below 9733 1726773067.72576: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773067.72579: getting variables 9733 1726773067.72581: in VariableManager get_vars() 9733 1726773067.72611: Calling all_inventory to load vars for managed_node3 9733 1726773067.72614: Calling groups_inventory to load vars for managed_node3 9733 1726773067.72616: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773067.72625: Calling all_plugins_play to load vars for managed_node3 9733 1726773067.72628: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773067.72630: Calling groups_plugins_play to load vars for managed_node3 9733 1726773067.72753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773067.72888: done with get_vars() 9733 1726773067.72897: done getting variables 9733 1726773067.72936: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure python command exists for tests below] **************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings_conf_files.yml:14 Thursday 19 September 2024 15:11:07 -0400 (0:00:00.668) 0:00:13.462 **** 9733 1726773067.72956: entering _queue_task() for managed_node3/package 9733 1726773067.73127: worker is 1 (out of 1 available) 9733 1726773067.73142: exiting _queue_task() for managed_node3/package 9733 1726773067.73157: done queuing things up, now waiting for results queue to drain 9733 1726773067.73159: waiting for pending results... 
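Worth noting about the copy task that just finished: the action plugin first ran the stat module against the destination, then transferred the source and ran the copy module, and it reported changed: true because the destination (an empty temp file) had a different content digest than the expected contents. A rough illustration of that decision using the two digests from the log (not Ansible's actual code):

def needs_copy(dest_sha1, src_sha1):
    # Transfer and report "changed" only when the content digests differ.
    return dest_sha1 != src_sha1

empty_tempfile = "da39a3ee5e6b4b0d3255bfef95601890afd80709"  # from the stat result
expected_body = "169e46a0442e2493edda7ef1d37dfc572d06525c"   # from the copy result
print(needs_copy(empty_tempfile, expected_body))             # True -> changed: true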
10299 1726773067.73281: running TaskExecutor() for managed_node3/TASK: Ensure python command exists for tests below 10299 1726773067.73391: in run() - task 0affffe7-6841-7dd6-8fa6-000000000229 10299 1726773067.73410: variable 'ansible_search_path' from source: unknown 10299 1726773067.73414: variable 'ansible_search_path' from source: unknown 10299 1726773067.73443: calling self._execute() 10299 1726773067.73510: variable 'ansible_host' from source: host vars for 'managed_node3' 10299 1726773067.73519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10299 1726773067.73525: variable 'omit' from source: magic vars 10299 1726773067.73605: variable 'omit' from source: magic vars 10299 1726773067.73640: variable 'omit' from source: magic vars 10299 1726773067.73660: variable '__kernel_settings_test_python_pkgs' from source: include_vars 10299 1726773067.73909: variable '__kernel_settings_test_python_pkgs' from source: include_vars 10299 1726773067.74131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10299 1726773067.75860: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10299 1726773067.75921: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10299 1726773067.75951: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10299 1726773067.75978: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10299 1726773067.76000: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10299 1726773067.76070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10299 1726773067.76094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10299 1726773067.76112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10299 1726773067.76138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10299 1726773067.76147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10299 1726773067.76226: variable '__kernel_settings_is_ostree' from source: set_fact 10299 1726773067.76232: variable 'omit' from source: magic vars 10299 1726773067.76253: variable 'omit' from source: magic vars 10299 1726773067.76271: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10299 1726773067.76290: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10299 1726773067.76312: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10299 1726773067.76329: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10299 1726773067.76339: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10299 1726773067.76362: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10299 1726773067.76366: variable 'ansible_host' from source: host vars for 'managed_node3' 10299 1726773067.76373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10299 1726773067.76460: Set connection var ansible_timeout to 10 10299 1726773067.76465: Set connection var ansible_shell_type to sh 10299 1726773067.76471: Set connection var ansible_module_compression to ZIP_DEFLATED 10299 1726773067.76478: Set connection var ansible_shell_executable to /bin/sh 10299 1726773067.76483: Set connection var ansible_pipelining to False 10299 1726773067.76492: Set connection var ansible_connection to ssh 10299 1726773067.76508: variable 'ansible_shell_executable' from source: unknown 10299 1726773067.76512: variable 'ansible_connection' from source: unknown 10299 1726773067.76515: variable 'ansible_module_compression' from source: unknown 10299 1726773067.76519: variable 'ansible_shell_type' from source: unknown 10299 1726773067.76522: variable 'ansible_shell_executable' from source: unknown 10299 1726773067.76525: variable 'ansible_host' from source: host vars for 'managed_node3' 10299 1726773067.76529: variable 'ansible_pipelining' from source: unknown 10299 1726773067.76533: variable 'ansible_timeout' from source: unknown 10299 1726773067.76541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10299 1726773067.76623: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10299 1726773067.76634: variable 'omit' from source: magic vars 10299 1726773067.76640: starting attempt loop 10299 1726773067.76643: running the handler 10299 1726773067.76724: variable 'ansible_facts' from source: unknown 10299 1726773067.76834: _low_level_execute_command(): starting 10299 1726773067.76842: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10299 1726773067.79149: stdout chunk (state=2): >>>/root <<< 10299 1726773067.79270: stderr chunk (state=3): >>><<< 10299 1726773067.79278: stdout chunk (state=3): >>><<< 10299 1726773067.79300: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10299 1726773067.79314: _low_level_execute_command(): starting 10299 1726773067.79320: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773067.7930908-10299-149021143842608 `" && echo ansible-tmp-1726773067.7930908-10299-149021143842608="` echo /root/.ansible/tmp/ansible-tmp-1726773067.7930908-10299-149021143842608 `" ) && sleep 0' 10299 1726773067.81810: stdout chunk (state=2): >>>ansible-tmp-1726773067.7930908-10299-149021143842608=/root/.ansible/tmp/ansible-tmp-1726773067.7930908-10299-149021143842608 <<< 10299 1726773067.81939: stderr chunk (state=3): >>><<< 10299 1726773067.81947: stdout chunk (state=3): >>><<< 10299 1726773067.81966: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726773067.7930908-10299-149021143842608=/root/.ansible/tmp/ansible-tmp-1726773067.7930908-10299-149021143842608 , stderr= 10299 1726773067.81996: variable 'ansible_module_compression' from source: unknown 10299 1726773067.82043: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 10299 1726773067.82084: variable 'ansible_facts' from source: unknown 10299 1726773067.82177: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773067.7930908-10299-149021143842608/AnsiballZ_dnf.py 10299 1726773067.82283: Sending initial data 10299 1726773067.82294: Sent initial data (151 bytes) 10299 1726773067.84873: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpw74hwcys /root/.ansible/tmp/ansible-tmp-1726773067.7930908-10299-149021143842608/AnsiballZ_dnf.py <<< 10299 1726773067.86336: stderr chunk (state=3): >>><<< 10299 1726773067.86345: stdout chunk (state=3): >>><<< 10299 1726773067.86365: done transferring module to remote 10299 1726773067.86380: _low_level_execute_command(): starting 10299 1726773067.86387: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773067.7930908-10299-149021143842608/ /root/.ansible/tmp/ansible-tmp-1726773067.7930908-10299-149021143842608/AnsiballZ_dnf.py && sleep 0' 10299 1726773067.88788: stderr chunk (state=2): >>><<< 10299 1726773067.88798: stdout chunk (state=2): >>><<< 10299 1726773067.88813: _low_level_execute_command() done: rc=0, stdout=, stderr= 10299 1726773067.88817: _low_level_execute_command(): starting 10299 1726773067.88822: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773067.7930908-10299-149021143842608/AnsiballZ_dnf.py && sleep 0' 10299 1726773070.43311: stdout chunk (state=2): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 10299 1726773070.50893: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. 
<<< 10299 1726773070.50944: stderr chunk (state=3): >>><<< 10299 1726773070.50952: stdout chunk (state=3): >>><<< 10299 1726773070.50969: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.47.99 closed. 10299 1726773070.51005: done with _execute_module (ansible.legacy.dnf, {'name': ['python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773067.7930908-10299-149021143842608/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10299 1726773070.51013: _low_level_execute_command(): starting 10299 1726773070.51019: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773067.7930908-10299-149021143842608/ > /dev/null 2>&1 && sleep 0' 10299 1726773070.53474: stderr chunk (state=2): >>><<< 10299 1726773070.53483: stdout chunk (state=2): >>><<< 10299 1726773070.53501: _low_level_execute_command() done: rc=0, stdout=, stderr= 10299 1726773070.53509: handler run complete 10299 1726773070.53533: attempt loop complete, returning result 10299 1726773070.53537: _execute() done 10299 1726773070.53540: dumping result to json 10299 1726773070.53547: done dumping result, returning 10299 1726773070.53554: done running TaskExecutor() for managed_node3/TASK: Ensure python command exists for tests below [0affffe7-6841-7dd6-8fa6-000000000229] 10299 1726773070.53560: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000229 10299 1726773070.53591: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000229 10299 1726773070.53595: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 9733 1726773070.53734: no more pending results, returning what we have 9733 1726773070.53737: results queue empty 9733 1726773070.53737: checking for any_errors_fatal 9733 1726773070.53744: done checking for any_errors_fatal 9733 1726773070.53745: checking for max_fail_percentage 9733 1726773070.53746: done checking for max_fail_percentage 9733 1726773070.53747: checking to see if all hosts have failed and the running result is not ok 9733 1726773070.53747: done checking to see if all hosts have failed 9733 1726773070.53748: getting the remaining hosts for this loop 9733 1726773070.53749: done getting the remaining hosts for this 
loop 9733 1726773070.53752: getting the next task for host managed_node3 9733 1726773070.53757: done getting next task for host managed_node3 9733 1726773070.53759: ^ task is: TASK: Diff expected vs actual content 9733 1726773070.53762: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773070.53766: getting variables 9733 1726773070.53767: in VariableManager get_vars() 9733 1726773070.53803: Calling all_inventory to load vars for managed_node3 9733 1726773070.53806: Calling groups_inventory to load vars for managed_node3 9733 1726773070.53808: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773070.53817: Calling all_plugins_play to load vars for managed_node3 9733 1726773070.53819: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773070.53822: Calling groups_plugins_play to load vars for managed_node3 9733 1726773070.53948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773070.54065: done with get_vars() 9733 1726773070.54076: done getting variables 9733 1726773070.54121: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Diff expected vs actual content] ***************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings_conf_files.yml:20 Thursday 19 September 2024 15:11:10 -0400 (0:00:02.811) 0:00:16.273 **** 9733 1726773070.54145: entering _queue_task() for managed_node3/shell 9733 1726773070.54327: worker is 1 (out of 1 available) 9733 1726773070.54344: exiting _queue_task() for managed_node3/shell 9733 1726773070.54357: done queuing things up, now waiting for results queue to drain 9733 1726773070.54359: waiting for pending results... 
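The package task above ensured python3-configobj is present (dnf reported "Nothing to do" because it already was). That precondition matters for the task that runs next, which imports configobj under /usr/libexec/platform-python on the managed node. A quick, hedged way to check the same precondition by hand would be:

import subprocess

# /usr/libexec/platform-python is the interpreter seen in this log; adjust the
# path when checking a different host or interpreter.
rc = subprocess.run(
    ["/usr/libexec/platform-python", "-c", "import configobj"]
).returncode
print("configobj importable" if rc == 0 else "configobj missing")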
10375 1726773070.54470: running TaskExecutor() for managed_node3/TASK: Diff expected vs actual content 10375 1726773070.54579: in run() - task 0affffe7-6841-7dd6-8fa6-00000000022a 10375 1726773070.54599: variable 'ansible_search_path' from source: unknown 10375 1726773070.54603: variable 'ansible_search_path' from source: unknown 10375 1726773070.54632: calling self._execute() 10375 1726773070.54697: variable 'ansible_host' from source: host vars for 'managed_node3' 10375 1726773070.54706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10375 1726773070.54714: variable 'omit' from source: magic vars 10375 1726773070.54788: variable 'omit' from source: magic vars 10375 1726773070.54824: variable 'omit' from source: magic vars 10375 1726773070.55084: variable '__kernel_settings_test_python_cmd' from source: include_vars 10375 1726773070.55098: variable '__kernel_settings_register_profile_conf_tempfile' from source: set_fact 10375 1726773070.55107: variable '__kernel_settings_profile_filename' from source: role '' exported vars 10375 1726773070.55162: variable '__kernel_settings_profile_dir' from source: role '' exported vars 10375 1726773070.55223: variable '__kernel_settings_profile_parent' from source: set_fact 10375 1726773070.55230: variable '__kernel_settings_tuned_profile' from source: role '' exported vars 10375 1726773070.55313: variable 'omit' from source: magic vars 10375 1726773070.55347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10375 1726773070.55373: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10375 1726773070.55394: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10375 1726773070.55408: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10375 1726773070.55419: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10375 1726773070.55443: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10375 1726773070.55448: variable 'ansible_host' from source: host vars for 'managed_node3' 10375 1726773070.55452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10375 1726773070.55527: Set connection var ansible_timeout to 10 10375 1726773070.55532: Set connection var ansible_shell_type to sh 10375 1726773070.55538: Set connection var ansible_module_compression to ZIP_DEFLATED 10375 1726773070.55544: Set connection var ansible_shell_executable to /bin/sh 10375 1726773070.55550: Set connection var ansible_pipelining to False 10375 1726773070.55557: Set connection var ansible_connection to ssh 10375 1726773070.55574: variable 'ansible_shell_executable' from source: unknown 10375 1726773070.55578: variable 'ansible_connection' from source: unknown 10375 1726773070.55581: variable 'ansible_module_compression' from source: unknown 10375 1726773070.55586: variable 'ansible_shell_type' from source: unknown 10375 1726773070.55590: variable 'ansible_shell_executable' from source: unknown 10375 1726773070.55594: variable 'ansible_host' from source: host vars for 'managed_node3' 10375 1726773070.55598: variable 'ansible_pipelining' from source: unknown 10375 1726773070.55601: variable 'ansible_timeout' from source: unknown 10375 1726773070.55606: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 10375 1726773070.55700: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10375 1726773070.55712: variable 'omit' from source: magic vars 10375 1726773070.55719: starting attempt loop 10375 1726773070.55722: running the handler 10375 1726773070.55730: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10375 1726773070.55746: _low_level_execute_command(): starting 10375 1726773070.55754: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10375 1726773070.58055: stdout chunk (state=2): >>>/root <<< 10375 1726773070.58181: stderr chunk (state=3): >>><<< 10375 1726773070.58190: stdout chunk (state=3): >>><<< 10375 1726773070.58211: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10375 1726773070.58225: _low_level_execute_command(): starting 10375 1726773070.58231: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773070.5821972-10375-2660947182919 `" && echo ansible-tmp-1726773070.5821972-10375-2660947182919="` echo /root/.ansible/tmp/ansible-tmp-1726773070.5821972-10375-2660947182919 `" ) && sleep 0' 10375 1726773070.60687: stdout chunk (state=2): >>>ansible-tmp-1726773070.5821972-10375-2660947182919=/root/.ansible/tmp/ansible-tmp-1726773070.5821972-10375-2660947182919 <<< 10375 1726773070.60817: stderr chunk (state=3): >>><<< 10375 1726773070.60823: stdout chunk (state=3): >>><<< 10375 1726773070.60837: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773070.5821972-10375-2660947182919=/root/.ansible/tmp/ansible-tmp-1726773070.5821972-10375-2660947182919 , stderr= 10375 1726773070.60861: variable 'ansible_module_compression' from source: unknown 10375 1726773070.60901: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10375 1726773070.60931: variable 'ansible_facts' from source: unknown 10375 1726773070.61005: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773070.5821972-10375-2660947182919/AnsiballZ_command.py 10375 1726773070.61101: Sending initial data 10375 1726773070.61108: Sent initial data (153 bytes) 10375 1726773070.63675: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpet0zra2m /root/.ansible/tmp/ansible-tmp-1726773070.5821972-10375-2660947182919/AnsiballZ_command.py <<< 10375 1726773070.64851: stderr chunk (state=3): >>><<< 10375 1726773070.64858: stdout chunk (state=3): >>><<< 10375 1726773070.64878: done transferring module to remote 10375 1726773070.64891: _low_level_execute_command(): starting 10375 1726773070.64897: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773070.5821972-10375-2660947182919/ /root/.ansible/tmp/ansible-tmp-1726773070.5821972-10375-2660947182919/AnsiballZ_command.py && sleep 0' 10375 1726773070.67261: stderr chunk (state=2): >>><<< 10375 1726773070.67273: stdout chunk (state=2): >>><<< 10375 
1726773070.67291: _low_level_execute_command() done: rc=0, stdout=, stderr= 10375 1726773070.67295: _low_level_execute_command(): starting 10375 1726773070.67300: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773070.5821972-10375-2660947182919/AnsiballZ_command.py && sleep 0' 10375 1726773070.85374: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "/usr/libexec/platform-python -c 'import sys\nfrom configobj import ConfigObj\nc1 = ConfigObj(sys.argv[1])\nc2 = ConfigObj(sys.argv[2])\nsys.exit(not c1 == c2)' /tmp/ansible.ur0ymozm.kernel_settings /etc/tuned/kernel_settings/tuned.conf\n", "start": "2024-09-19 15:11:10.821855", "end": "2024-09-19 15:11:10.851958", "delta": "0:00:00.030103", "msg": "", "invocation": {"module_args": {"_raw_params": "/usr/libexec/platform-python -c 'import sys\nfrom configobj import ConfigObj\nc1 = ConfigObj(sys.argv[1])\nc2 = ConfigObj(sys.argv[2])\nsys.exit(not c1 == c2)' /tmp/ansible.ur0ymozm.kernel_settings /etc/tuned/kernel_settings/tuned.conf\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10375 1726773070.86583: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10375 1726773070.86632: stderr chunk (state=3): >>><<< 10375 1726773070.86638: stdout chunk (state=3): >>><<< 10375 1726773070.86656: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "/usr/libexec/platform-python -c 'import sys\nfrom configobj import ConfigObj\nc1 = ConfigObj(sys.argv[1])\nc2 = ConfigObj(sys.argv[2])\nsys.exit(not c1 == c2)' /tmp/ansible.ur0ymozm.kernel_settings /etc/tuned/kernel_settings/tuned.conf\n", "start": "2024-09-19 15:11:10.821855", "end": "2024-09-19 15:11:10.851958", "delta": "0:00:00.030103", "msg": "", "invocation": {"module_args": {"_raw_params": "/usr/libexec/platform-python -c 'import sys\nfrom configobj import ConfigObj\nc1 = ConfigObj(sys.argv[1])\nc2 = ConfigObj(sys.argv[2])\nsys.exit(not c1 == c2)' /tmp/ansible.ur0ymozm.kernel_settings /etc/tuned/kernel_settings/tuned.conf\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.47.99 closed. 
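For readability, the inline Python this task executes (visible verbatim in the cmd field of the result) compares the two files as parsed ConfigObj structures rather than byte-for-byte, so whitespace and comment differences are ignored. Restated as a standalone script:

import sys
from configobj import ConfigObj

# argv[1] is the temp file holding the expected contents, argv[2] the deployed
# /etc/tuned/kernel_settings/tuned.conf; exit 0 when they parse to equal data.
c1 = ConfigObj(sys.argv[1])
c2 = ConfigObj(sys.argv[2])
sys.exit(not c1 == c2)

The rc=0 reported for this command means the parsed contents matched, which is why the verification and diff-display tasks that follow are skipped.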
10375 1726773070.86686: done with _execute_module (ansible.legacy.command, {'_raw_params': "/usr/libexec/platform-python -c 'import sys\nfrom configobj import ConfigObj\nc1 = ConfigObj(sys.argv[1])\nc2 = ConfigObj(sys.argv[2])\nsys.exit(not c1 == c2)' /tmp/ansible.ur0ymozm.kernel_settings /etc/tuned/kernel_settings/tuned.conf\n", '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773070.5821972-10375-2660947182919/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10375 1726773070.86697: _low_level_execute_command(): starting 10375 1726773070.86702: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773070.5821972-10375-2660947182919/ > /dev/null 2>&1 && sleep 0' 10375 1726773070.89163: stderr chunk (state=2): >>><<< 10375 1726773070.89173: stdout chunk (state=2): >>><<< 10375 1726773070.89191: _low_level_execute_command() done: rc=0, stdout=, stderr= 10375 1726773070.89198: handler run complete 10375 1726773070.89215: Evaluated conditional (False): False 10375 1726773070.89225: attempt loop complete, returning result 10375 1726773070.89231: _execute() done 10375 1726773070.89234: dumping result to json 10375 1726773070.89239: done dumping result, returning 10375 1726773070.89247: done running TaskExecutor() for managed_node3/TASK: Diff expected vs actual content [0affffe7-6841-7dd6-8fa6-00000000022a] 10375 1726773070.89253: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000022a 10375 1726773070.89283: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000022a 10375 1726773070.89289: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "/usr/libexec/platform-python -c 'import sys\nfrom configobj import ConfigObj\nc1 = ConfigObj(sys.argv[1])\nc2 = ConfigObj(sys.argv[2])\nsys.exit(not c1 == c2)' /tmp/ansible.ur0ymozm.kernel_settings /etc/tuned/kernel_settings/tuned.conf\n", "delta": "0:00:00.030103", "end": "2024-09-19 15:11:10.851958", "rc": 0, "start": "2024-09-19 15:11:10.821855" } 9733 1726773070.89421: no more pending results, returning what we have 9733 1726773070.89424: results queue empty 9733 1726773070.89425: checking for any_errors_fatal 9733 1726773070.89432: done checking for any_errors_fatal 9733 1726773070.89432: checking for max_fail_percentage 9733 1726773070.89434: done checking for max_fail_percentage 9733 1726773070.89434: checking to see if all hosts have failed and the running result is not ok 9733 1726773070.89435: done checking to see if all hosts have failed 9733 1726773070.89435: getting the remaining hosts for this loop 9733 1726773070.89436: done getting the remaining hosts for this loop 9733 1726773070.89439: getting the next task for host managed_node3 9733 1726773070.89445: done getting next task for host managed_node3 9733 1726773070.89447: ^ task is: TASK: Verify expected content 9733 1726773070.89450: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773070.89453: getting variables 9733 1726773070.89454: in VariableManager get_vars() 9733 1726773070.89488: Calling all_inventory to load vars for managed_node3 9733 1726773070.89491: Calling groups_inventory to load vars for managed_node3 9733 1726773070.89493: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773070.89503: Calling all_plugins_play to load vars for managed_node3 9733 1726773070.89505: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773070.89507: Calling groups_plugins_play to load vars for managed_node3 9733 1726773070.89626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773070.89777: done with get_vars() 9733 1726773070.89786: done getting variables 9733 1726773070.89827: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify expected content] ************************************************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings_conf_files.yml:33 Thursday 19 September 2024 15:11:10 -0400 (0:00:00.357) 0:00:16.631 **** 9733 1726773070.89846: entering _queue_task() for managed_node3/set_fact 9733 1726773070.90015: worker is 1 (out of 1 available) 9733 1726773070.90030: exiting _queue_task() for managed_node3/set_fact 9733 1726773070.90043: done queuing things up, now waiting for results queue to drain 9733 1726773070.90045: waiting for pending results... 
10383 1726773070.90159: running TaskExecutor() for managed_node3/TASK: Verify expected content 10383 1726773070.90257: in run() - task 0affffe7-6841-7dd6-8fa6-00000000022b 10383 1726773070.90276: variable 'ansible_search_path' from source: unknown 10383 1726773070.90280: variable 'ansible_search_path' from source: unknown 10383 1726773070.90309: calling self._execute() 10383 1726773070.90374: variable 'ansible_host' from source: host vars for 'managed_node3' 10383 1726773070.90382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10383 1726773070.90393: variable 'omit' from source: magic vars 10383 1726773070.90710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10383 1726773070.90884: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10383 1726773070.90919: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10383 1726773070.90944: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10383 1726773070.90966: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10383 1726773070.91049: variable '__kernel_settings_register_profile_conf_result' from source: set_fact 10383 1726773070.91071: Evaluated conditional (__kernel_settings_register_profile_conf_result is failed): False 10383 1726773070.91076: when evaluation is False, skipping this task 10383 1726773070.91079: _execute() done 10383 1726773070.91082: dumping result to json 10383 1726773070.91088: done dumping result, returning 10383 1726773070.91092: done running TaskExecutor() for managed_node3/TASK: Verify expected content [0affffe7-6841-7dd6-8fa6-00000000022b] 10383 1726773070.91096: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000022b 10383 1726773070.91114: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000022b 10383 1726773070.91116: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_register_profile_conf_result is failed", "skip_reason": "Conditional result was False" } 9733 1726773070.91322: no more pending results, returning what we have 9733 1726773070.91324: results queue empty 9733 1726773070.91324: checking for any_errors_fatal 9733 1726773070.91329: done checking for any_errors_fatal 9733 1726773070.91329: checking for max_fail_percentage 9733 1726773070.91330: done checking for max_fail_percentage 9733 1726773070.91331: checking to see if all hosts have failed and the running result is not ok 9733 1726773070.91331: done checking to see if all hosts have failed 9733 1726773070.91332: getting the remaining hosts for this loop 9733 1726773070.91332: done getting the remaining hosts for this loop 9733 1726773070.91334: getting the next task for host managed_node3 9733 1726773070.91338: done getting next task for host managed_node3 9733 1726773070.91340: ^ task is: TASK: Show diff - may not reflect actual ConfigObj differences 9733 1726773070.91342: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773070.91345: getting variables 9733 1726773070.91346: in VariableManager get_vars() 9733 1726773070.91371: Calling all_inventory to load vars for managed_node3 9733 1726773070.91373: Calling groups_inventory to load vars for managed_node3 9733 1726773070.91374: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773070.91381: Calling all_plugins_play to load vars for managed_node3 9733 1726773070.91383: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773070.91384: Calling groups_plugins_play to load vars for managed_node3 9733 1726773070.91484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773070.91599: done with get_vars() 9733 1726773070.91606: done getting variables 9733 1726773070.91644: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show diff - may not reflect actual ConfigObj differences] **************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings_conf_files.yml:38 Thursday 19 September 2024 15:11:10 -0400 (0:00:00.018) 0:00:16.649 **** 9733 1726773070.91664: entering _queue_task() for managed_node3/command 9733 1726773070.91816: worker is 1 (out of 1 available) 9733 1726773070.91833: exiting _queue_task() for managed_node3/command 9733 1726773070.91845: done queuing things up, now waiting for results queue to drain 9733 1726773070.91847: waiting for pending results... 
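The skip above, and the matching skip for the "Show diff" task that follows, both hinge on the "is failed" Jinja test applied to the registered result of the ConfigObj comparison. Because that command exited with rc=0, its result carries no failure, the conditional evaluates to False, and both tasks are skipped. A minimal sketch of what the test amounts to (an approximation for illustration, not the exact ansible-core source):

def is_failed(result):
    # A registered task result counts as failed when it carries a truthy
    # "failed" key; a clean rc=0 command result does not.
    return bool(result.get("failed", False))

print(is_failed({"rc": 0, "changed": True}))  # False -> conditional skips the task
print(is_failed({"failed": True, "rc": 1}))   # True  -> the task would run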
10384 1726773070.91952: running TaskExecutor() for managed_node3/TASK: Show diff - may not reflect actual ConfigObj differences 10384 1726773070.92051: in run() - task 0affffe7-6841-7dd6-8fa6-00000000022c 10384 1726773070.92068: variable 'ansible_search_path' from source: unknown 10384 1726773070.92072: variable 'ansible_search_path' from source: unknown 10384 1726773070.92099: calling self._execute() 10384 1726773070.92161: variable 'ansible_host' from source: host vars for 'managed_node3' 10384 1726773070.92169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10384 1726773070.92178: variable 'omit' from source: magic vars 10384 1726773070.92490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10384 1726773070.92712: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10384 1726773070.92745: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10384 1726773070.92773: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10384 1726773070.92800: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10384 1726773070.92876: variable '__kernel_settings_register_profile_conf_result' from source: set_fact 10384 1726773070.92899: Evaluated conditional (__kernel_settings_register_profile_conf_result is failed): False 10384 1726773070.92903: when evaluation is False, skipping this task 10384 1726773070.92907: _execute() done 10384 1726773070.92911: dumping result to json 10384 1726773070.92914: done dumping result, returning 10384 1726773070.92920: done running TaskExecutor() for managed_node3/TASK: Show diff - may not reflect actual ConfigObj differences [0affffe7-6841-7dd6-8fa6-00000000022c] 10384 1726773070.92926: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000022c 10384 1726773070.92948: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000022c 10384 1726773070.92952: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_register_profile_conf_result is failed", "skip_reason": "Conditional result was False" } 9733 1726773070.93090: no more pending results, returning what we have 9733 1726773070.93093: results queue empty 9733 1726773070.93093: checking for any_errors_fatal 9733 1726773070.93098: done checking for any_errors_fatal 9733 1726773070.93098: checking for max_fail_percentage 9733 1726773070.93100: done checking for max_fail_percentage 9733 1726773070.93100: checking to see if all hosts have failed and the running result is not ok 9733 1726773070.93101: done checking to see if all hosts have failed 9733 1726773070.93101: getting the remaining hosts for this loop 9733 1726773070.93102: done getting the remaining hosts for this loop 9733 1726773070.93105: getting the next task for host managed_node3 9733 1726773070.93110: done getting next task for host managed_node3 9733 1726773070.93112: ^ task is: TASK: Get active_profile file 9733 1726773070.93115: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773070.93118: getting variables 9733 1726773070.93119: in VariableManager get_vars() 9733 1726773070.93147: Calling all_inventory to load vars for managed_node3 9733 1726773070.93149: Calling groups_inventory to load vars for managed_node3 9733 1726773070.93150: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773070.93157: Calling all_plugins_play to load vars for managed_node3 9733 1726773070.93158: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773070.93160: Calling groups_plugins_play to load vars for managed_node3 9733 1726773070.93265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773070.93414: done with get_vars() 9733 1726773070.93422: done getting variables TASK [Get active_profile file] ************************************************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings_conf_files.yml:46 Thursday 19 September 2024 15:11:10 -0400 (0:00:00.018) 0:00:16.667 **** 9733 1726773070.93484: entering _queue_task() for managed_node3/slurp 9733 1726773070.93643: worker is 1 (out of 1 available) 9733 1726773070.93659: exiting _queue_task() for managed_node3/slurp 9733 1726773070.93673: done queuing things up, now waiting for results queue to drain 9733 1726773070.93674: waiting for pending results... 
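The task queued above uses the slurp action to read a file from the managed node. Combining the module arguments echoed later in the log ({'src': '/etc/tuned/active_profile'}) with the registered variable that the follow-up check reads, the task is roughly the sketch below; the actual file at assert_kernel_settings_conf_files.yml:46 may differ in detail.

- name: Get active_profile file
  ansible.builtin.slurp:
    src: /etc/tuned/active_profile
  register: __kernel_settings_register_profile_conf_active_profile

slurp always returns the file content base64-encoded, which is why the result printed below carries a content/encoding pair rather than plain text.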
10385 1726773070.93782: running TaskExecutor() for managed_node3/TASK: Get active_profile file 10385 1726773070.93884: in run() - task 0affffe7-6841-7dd6-8fa6-00000000022d 10385 1726773070.93902: variable 'ansible_search_path' from source: unknown 10385 1726773070.93906: variable 'ansible_search_path' from source: unknown 10385 1726773070.93935: calling self._execute() 10385 1726773070.93999: variable 'ansible_host' from source: host vars for 'managed_node3' 10385 1726773070.94007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10385 1726773070.94015: variable 'omit' from source: magic vars 10385 1726773070.94088: variable 'omit' from source: magic vars 10385 1726773070.94123: variable 'omit' from source: magic vars 10385 1726773070.94146: variable 'omit' from source: magic vars 10385 1726773070.94178: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10385 1726773070.94207: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10385 1726773070.94227: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10385 1726773070.94242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10385 1726773070.94253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10385 1726773070.94277: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10385 1726773070.94283: variable 'ansible_host' from source: host vars for 'managed_node3' 10385 1726773070.94289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10385 1726773070.94359: Set connection var ansible_timeout to 10 10385 1726773070.94364: Set connection var ansible_shell_type to sh 10385 1726773070.94370: Set connection var ansible_module_compression to ZIP_DEFLATED 10385 1726773070.94376: Set connection var ansible_shell_executable to /bin/sh 10385 1726773070.94382: Set connection var ansible_pipelining to False 10385 1726773070.94390: Set connection var ansible_connection to ssh 10385 1726773070.94405: variable 'ansible_shell_executable' from source: unknown 10385 1726773070.94409: variable 'ansible_connection' from source: unknown 10385 1726773070.94412: variable 'ansible_module_compression' from source: unknown 10385 1726773070.94416: variable 'ansible_shell_type' from source: unknown 10385 1726773070.94419: variable 'ansible_shell_executable' from source: unknown 10385 1726773070.94423: variable 'ansible_host' from source: host vars for 'managed_node3' 10385 1726773070.94427: variable 'ansible_pipelining' from source: unknown 10385 1726773070.94430: variable 'ansible_timeout' from source: unknown 10385 1726773070.94434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10385 1726773070.94576: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10385 1726773070.94589: variable 'omit' from source: magic vars 10385 1726773070.94595: starting attempt loop 10385 1726773070.94599: running the handler 10385 1726773070.94610: _low_level_execute_command(): starting 10385 1726773070.94618: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ 
&& sleep 0' 10385 1726773070.97001: stdout chunk (state=2): >>>/root <<< 10385 1726773070.97115: stderr chunk (state=3): >>><<< 10385 1726773070.97122: stdout chunk (state=3): >>><<< 10385 1726773070.97143: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10385 1726773070.97156: _low_level_execute_command(): starting 10385 1726773070.97162: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773070.9715087-10385-12561081721858 `" && echo ansible-tmp-1726773070.9715087-10385-12561081721858="` echo /root/.ansible/tmp/ansible-tmp-1726773070.9715087-10385-12561081721858 `" ) && sleep 0' 10385 1726773070.99643: stdout chunk (state=2): >>>ansible-tmp-1726773070.9715087-10385-12561081721858=/root/.ansible/tmp/ansible-tmp-1726773070.9715087-10385-12561081721858 <<< 10385 1726773070.99776: stderr chunk (state=3): >>><<< 10385 1726773070.99783: stdout chunk (state=3): >>><<< 10385 1726773070.99801: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773070.9715087-10385-12561081721858=/root/.ansible/tmp/ansible-tmp-1726773070.9715087-10385-12561081721858 , stderr= 10385 1726773070.99838: variable 'ansible_module_compression' from source: unknown 10385 1726773070.99879: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.slurp-ZIP_DEFLATED 10385 1726773070.99910: variable 'ansible_facts' from source: unknown 10385 1726773070.99987: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773070.9715087-10385-12561081721858/AnsiballZ_slurp.py 10385 1726773071.00087: Sending initial data 10385 1726773071.00094: Sent initial data (152 bytes) 10385 1726773071.02657: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmp9f78oh30 /root/.ansible/tmp/ansible-tmp-1726773070.9715087-10385-12561081721858/AnsiballZ_slurp.py <<< 10385 1726773071.03794: stderr chunk (state=3): >>><<< 10385 1726773071.03802: stdout chunk (state=3): >>><<< 10385 1726773071.03823: done transferring module to remote 10385 1726773071.03834: _low_level_execute_command(): starting 10385 1726773071.03840: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773070.9715087-10385-12561081721858/ /root/.ansible/tmp/ansible-tmp-1726773070.9715087-10385-12561081721858/AnsiballZ_slurp.py && sleep 0' 10385 1726773071.06228: stderr chunk (state=2): >>><<< 10385 1726773071.06236: stdout chunk (state=2): >>><<< 10385 1726773071.06250: _low_level_execute_command() done: rc=0, stdout=, stderr= 10385 1726773071.06254: _low_level_execute_command(): starting 10385 1726773071.06259: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773070.9715087-10385-12561081721858/AnsiballZ_slurp.py && sleep 0' 10385 1726773071.20842: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"src": "/etc/tuned/active_profile"}}} <<< 10385 1726773071.21813: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. 
<<< 10385 1726773071.21862: stderr chunk (state=3): >>><<< 10385 1726773071.21869: stdout chunk (state=3): >>><<< 10385 1726773071.21889: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.47.99 closed. 10385 1726773071.21921: done with _execute_module (slurp, {'src': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773070.9715087-10385-12561081721858/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10385 1726773071.21933: _low_level_execute_command(): starting 10385 1726773071.21940: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773070.9715087-10385-12561081721858/ > /dev/null 2>&1 && sleep 0' 10385 1726773071.24392: stderr chunk (state=2): >>><<< 10385 1726773071.24401: stdout chunk (state=2): >>><<< 10385 1726773071.24415: _low_level_execute_command() done: rc=0, stdout=, stderr= 10385 1726773071.24422: handler run complete 10385 1726773071.24436: attempt loop complete, returning result 10385 1726773071.24439: _execute() done 10385 1726773071.24443: dumping result to json 10385 1726773071.24447: done dumping result, returning 10385 1726773071.24454: done running TaskExecutor() for managed_node3/TASK: Get active_profile file [0affffe7-6841-7dd6-8fa6-00000000022d] 10385 1726773071.24461: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000022d 10385 1726773071.24494: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000022d 10385 1726773071.24498: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "encoding": "base64", "source": "/etc/tuned/active_profile" } 9733 1726773071.24618: no more pending results, returning what we have 9733 1726773071.24621: results queue empty 9733 1726773071.24621: checking for any_errors_fatal 9733 1726773071.24626: done checking for any_errors_fatal 9733 1726773071.24627: checking for max_fail_percentage 9733 1726773071.24628: done checking for max_fail_percentage 9733 1726773071.24629: checking to see if all hosts have failed and the running result is not ok 9733 1726773071.24629: done checking to see if all hosts have failed 9733 1726773071.24630: getting the remaining hosts for this loop 9733 1726773071.24631: done getting the remaining hosts for this loop 9733 1726773071.24634: getting the next task for host managed_node3 9733 1726773071.24638: done getting next task for host managed_node3 9733 1726773071.24640: ^ task is: TASK: Check that active_profile ends with kernel_settings 9733 1726773071.24643: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773071.24646: getting variables 9733 1726773071.24647: in VariableManager get_vars() 9733 1726773071.24680: Calling all_inventory to load vars for managed_node3 9733 1726773071.24683: Calling groups_inventory to load vars for managed_node3 9733 1726773071.24686: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773071.24696: Calling all_plugins_play to load vars for managed_node3 9733 1726773071.24698: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773071.24700: Calling groups_plugins_play to load vars for managed_node3 9733 1726773071.24825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773071.24940: done with get_vars() 9733 1726773071.24948: done getting variables 9733 1726773071.24997: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check that active_profile ends with kernel_settings] ********************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings_conf_files.yml:53 Thursday 19 September 2024 15:11:11 -0400 (0:00:00.315) 0:00:16.982 **** 9733 1726773071.25018: entering _queue_task() for managed_node3/set_fact 9733 1726773071.25192: worker is 1 (out of 1 available) 9733 1726773071.25205: exiting _queue_task() for managed_node3/set_fact 9733 1726773071.25219: done queuing things up, now waiting for results queue to drain 9733 1726773071.25220: waiting for pending results... 
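The payload returned above, dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK, is just the base64 encoding of the tuned active_profile file and decodes to "virtual-guest kernel_settings" followed by a newline. To inspect such a registered slurp result, a debug task along these lines could be used; it is illustrative only and is not part of the test file.

- name: Show decoded active_profile          # illustrative helper, not in the original tasks file
  ansible.builtin.debug:
    msg: "{{ __kernel_settings_register_profile_conf_active_profile.content | b64decode | trim }}"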
10396 1726773071.25332: running TaskExecutor() for managed_node3/TASK: Check that active_profile ends with kernel_settings 10396 1726773071.25438: in run() - task 0affffe7-6841-7dd6-8fa6-00000000022e 10396 1726773071.25454: variable 'ansible_search_path' from source: unknown 10396 1726773071.25458: variable 'ansible_search_path' from source: unknown 10396 1726773071.25489: calling self._execute() 10396 1726773071.25554: variable 'ansible_host' from source: host vars for 'managed_node3' 10396 1726773071.25562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10396 1726773071.25571: variable 'omit' from source: magic vars 10396 1726773071.25920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10396 1726773071.27478: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10396 1726773071.27525: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10396 1726773071.27553: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10396 1726773071.27579: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10396 1726773071.27601: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10396 1726773071.27649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10396 1726773071.27949: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10396 1726773071.27977: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10396 1726773071.28003: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10396 1726773071.28024: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10396 1726773071.28115: variable '__kernel_settings_register_profile_conf_active_profile' from source: set_fact 10396 1726773071.28153: Evaluated conditional (__kernel_settings_register_profile_conf_active_profile.content | b64decode is not search('(^| )kernel_settings$')): False 10396 1726773071.28159: when evaluation is False, skipping this task 10396 1726773071.28162: _execute() done 10396 1726773071.28166: dumping result to json 10396 1726773071.28170: done dumping result, returning 10396 1726773071.28175: done running TaskExecutor() for managed_node3/TASK: Check that active_profile ends with kernel_settings [0affffe7-6841-7dd6-8fa6-00000000022e] 10396 1726773071.28181: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000022e 10396 1726773071.28206: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000022e 10396 1726773071.28209: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_register_profile_conf_active_profile.content | b64decode is not search('(^| )kernel_settings$')", "skip_reason": "Conditional result was False" } 9733 1726773071.28313: no more pending results, returning what we have 9733 1726773071.28316: results queue empty 9733 1726773071.28316: checking for any_errors_fatal 9733 1726773071.28321: done checking for any_errors_fatal 9733 1726773071.28322: checking for max_fail_percentage 9733 1726773071.28323: done checking for max_fail_percentage 9733 
1726773071.28323: checking to see if all hosts have failed and the running result is not ok 9733 1726773071.28324: done checking to see if all hosts have failed 9733 1726773071.28325: getting the remaining hosts for this loop 9733 1726773071.28326: done getting the remaining hosts for this loop 9733 1726773071.28328: getting the next task for host managed_node3 9733 1726773071.28334: done getting next task for host managed_node3 9733 1726773071.28336: ^ task is: TASK: Get profile_mode file 9733 1726773071.28339: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773071.28343: getting variables 9733 1726773071.28344: in VariableManager get_vars() 9733 1726773071.28374: Calling all_inventory to load vars for managed_node3 9733 1726773071.28377: Calling groups_inventory to load vars for managed_node3 9733 1726773071.28379: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773071.28389: Calling all_plugins_play to load vars for managed_node3 9733 1726773071.28391: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773071.28395: Calling groups_plugins_play to load vars for managed_node3 9733 1726773071.28558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773071.28672: done with get_vars() 9733 1726773071.28680: done getting variables TASK [Get profile_mode file] *************************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings_conf_files.yml:60 Thursday 19 September 2024 15:11:11 -0400 (0:00:00.037) 0:00:17.019 **** 9733 1726773071.28742: entering _queue_task() for managed_node3/slurp 9733 1726773071.28909: worker is 1 (out of 1 available) 9733 1726773071.28926: exiting _queue_task() for managed_node3/slurp 9733 1726773071.28937: done queuing things up, now waiting for results queue to drain 9733 1726773071.28939: waiting for pending results... 
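The check skipped above applies b64decode to the slurped content and then the search test with the pattern (^| )kernel_settings$, which matches when kernel_settings is the last space-separated word of the active profile. Since the decoded value is "virtual-guest kernel_settings", the pattern matches, the "is not search" condition is False, and the failure branch never runs. Reconstructed from the conditional shown in the log, the check looks roughly like this; the fact it would set on failure is not visible in the log, so the name used here is hypothetical.

- name: Check that active_profile ends with kernel_settings
  ansible.builtin.set_fact:
    __kernel_settings_test_failed: true      # hypothetical fact name; the real task body is not shown in this log
  when: "__kernel_settings_register_profile_conf_active_profile.content | b64decode is not search('(^| )kernel_settings$')"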
10397 1726773071.29052: running TaskExecutor() for managed_node3/TASK: Get profile_mode file 10397 1726773071.29159: in run() - task 0affffe7-6841-7dd6-8fa6-00000000022f 10397 1726773071.29176: variable 'ansible_search_path' from source: unknown 10397 1726773071.29180: variable 'ansible_search_path' from source: unknown 10397 1726773071.29210: calling self._execute() 10397 1726773071.29275: variable 'ansible_host' from source: host vars for 'managed_node3' 10397 1726773071.29283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10397 1726773071.29293: variable 'omit' from source: magic vars 10397 1726773071.29366: variable 'omit' from source: magic vars 10397 1726773071.29404: variable 'omit' from source: magic vars 10397 1726773071.29425: variable 'omit' from source: magic vars 10397 1726773071.29458: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10397 1726773071.29486: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10397 1726773071.29506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10397 1726773071.29521: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10397 1726773071.29531: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10397 1726773071.29557: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10397 1726773071.29562: variable 'ansible_host' from source: host vars for 'managed_node3' 10397 1726773071.29567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10397 1726773071.29638: Set connection var ansible_timeout to 10 10397 1726773071.29643: Set connection var ansible_shell_type to sh 10397 1726773071.29649: Set connection var ansible_module_compression to ZIP_DEFLATED 10397 1726773071.29654: Set connection var ansible_shell_executable to /bin/sh 10397 1726773071.29659: Set connection var ansible_pipelining to False 10397 1726773071.29664: Set connection var ansible_connection to ssh 10397 1726773071.29678: variable 'ansible_shell_executable' from source: unknown 10397 1726773071.29681: variable 'ansible_connection' from source: unknown 10397 1726773071.29683: variable 'ansible_module_compression' from source: unknown 10397 1726773071.29687: variable 'ansible_shell_type' from source: unknown 10397 1726773071.29689: variable 'ansible_shell_executable' from source: unknown 10397 1726773071.29691: variable 'ansible_host' from source: host vars for 'managed_node3' 10397 1726773071.29693: variable 'ansible_pipelining' from source: unknown 10397 1726773071.29695: variable 'ansible_timeout' from source: unknown 10397 1726773071.29697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10397 1726773071.29834: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10397 1726773071.29844: variable 'omit' from source: magic vars 10397 1726773071.29848: starting attempt loop 10397 1726773071.29850: running the handler 10397 1726773071.29859: _low_level_execute_command(): starting 10397 1726773071.29865: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && 
sleep 0' 10397 1726773071.32234: stdout chunk (state=2): >>>/root <<< 10397 1726773071.32358: stderr chunk (state=3): >>><<< 10397 1726773071.32364: stdout chunk (state=3): >>><<< 10397 1726773071.32382: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10397 1726773071.32396: _low_level_execute_command(): starting 10397 1726773071.32403: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773071.3239138-10397-58669186182188 `" && echo ansible-tmp-1726773071.3239138-10397-58669186182188="` echo /root/.ansible/tmp/ansible-tmp-1726773071.3239138-10397-58669186182188 `" ) && sleep 0' 10397 1726773071.34880: stdout chunk (state=2): >>>ansible-tmp-1726773071.3239138-10397-58669186182188=/root/.ansible/tmp/ansible-tmp-1726773071.3239138-10397-58669186182188 <<< 10397 1726773071.35016: stderr chunk (state=3): >>><<< 10397 1726773071.35023: stdout chunk (state=3): >>><<< 10397 1726773071.35040: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773071.3239138-10397-58669186182188=/root/.ansible/tmp/ansible-tmp-1726773071.3239138-10397-58669186182188 , stderr= 10397 1726773071.35080: variable 'ansible_module_compression' from source: unknown 10397 1726773071.35123: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.slurp-ZIP_DEFLATED 10397 1726773071.35153: variable 'ansible_facts' from source: unknown 10397 1726773071.35229: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773071.3239138-10397-58669186182188/AnsiballZ_slurp.py 10397 1726773071.35334: Sending initial data 10397 1726773071.35341: Sent initial data (152 bytes) 10397 1726773071.37879: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmptd3gb6zm /root/.ansible/tmp/ansible-tmp-1726773071.3239138-10397-58669186182188/AnsiballZ_slurp.py <<< 10397 1726773071.39038: stderr chunk (state=3): >>><<< 10397 1726773071.39049: stdout chunk (state=3): >>><<< 10397 1726773071.39067: done transferring module to remote 10397 1726773071.39079: _low_level_execute_command(): starting 10397 1726773071.39084: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773071.3239138-10397-58669186182188/ /root/.ansible/tmp/ansible-tmp-1726773071.3239138-10397-58669186182188/AnsiballZ_slurp.py && sleep 0' 10397 1726773071.41454: stderr chunk (state=2): >>><<< 10397 1726773071.41462: stdout chunk (state=2): >>><<< 10397 1726773071.41477: _low_level_execute_command() done: rc=0, stdout=, stderr= 10397 1726773071.41482: _low_level_execute_command(): starting 10397 1726773071.41488: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773071.3239138-10397-58669186182188/AnsiballZ_slurp.py && sleep 0' 10397 1726773071.56006: stdout chunk (state=2): >>> {"content": "bWFudWFsCg==", "source": "/etc/tuned/profile_mode", "encoding": "base64", "invocation": {"module_args": {"src": "/etc/tuned/profile_mode"}}} <<< 10397 1726773071.56936: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. 
<<< 10397 1726773071.56984: stderr chunk (state=3): >>><<< 10397 1726773071.56992: stdout chunk (state=3): >>><<< 10397 1726773071.57010: _low_level_execute_command() done: rc=0, stdout= {"content": "bWFudWFsCg==", "source": "/etc/tuned/profile_mode", "encoding": "base64", "invocation": {"module_args": {"src": "/etc/tuned/profile_mode"}}} , stderr=Shared connection to 10.31.47.99 closed. 10397 1726773071.57040: done with _execute_module (slurp, {'src': '/etc/tuned/profile_mode', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773071.3239138-10397-58669186182188/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10397 1726773071.57051: _low_level_execute_command(): starting 10397 1726773071.57058: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773071.3239138-10397-58669186182188/ > /dev/null 2>&1 && sleep 0' 10397 1726773071.59497: stderr chunk (state=2): >>><<< 10397 1726773071.59506: stdout chunk (state=2): >>><<< 10397 1726773071.59520: _low_level_execute_command() done: rc=0, stdout=, stderr= 10397 1726773071.59527: handler run complete 10397 1726773071.59540: attempt loop complete, returning result 10397 1726773071.59544: _execute() done 10397 1726773071.59548: dumping result to json 10397 1726773071.59552: done dumping result, returning 10397 1726773071.59559: done running TaskExecutor() for managed_node3/TASK: Get profile_mode file [0affffe7-6841-7dd6-8fa6-00000000022f] 10397 1726773071.59565: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000022f 10397 1726773071.59599: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000022f 10397 1726773071.59603: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "content": "bWFudWFsCg==", "encoding": "base64", "source": "/etc/tuned/profile_mode" } 9733 1726773071.59733: no more pending results, returning what we have 9733 1726773071.59736: results queue empty 9733 1726773071.59736: checking for any_errors_fatal 9733 1726773071.59741: done checking for any_errors_fatal 9733 1726773071.59741: checking for max_fail_percentage 9733 1726773071.59743: done checking for max_fail_percentage 9733 1726773071.59743: checking to see if all hosts have failed and the running result is not ok 9733 1726773071.59744: done checking to see if all hosts have failed 9733 1726773071.59745: getting the remaining hosts for this loop 9733 1726773071.59746: done getting the remaining hosts for this loop 9733 1726773071.59749: getting the next task for host managed_node3 9733 1726773071.59755: done getting next task for host managed_node3 9733 1726773071.59757: ^ task is: TASK: Check that profile_mode is manual 9733 1726773071.59760: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773071.59763: getting variables 9733 1726773071.59764: in VariableManager get_vars() 9733 1726773071.59799: Calling all_inventory to load vars for managed_node3 9733 1726773071.59802: Calling groups_inventory to load vars for managed_node3 9733 1726773071.59804: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773071.59814: Calling all_plugins_play to load vars for managed_node3 9733 1726773071.59816: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773071.59819: Calling groups_plugins_play to load vars for managed_node3 9733 1726773071.59935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773071.60050: done with get_vars() 9733 1726773071.60059: done getting variables 9733 1726773071.60105: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check that profile_mode is manual] *************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings_conf_files.yml:67 Thursday 19 September 2024 15:11:11 -0400 (0:00:00.313) 0:00:17.333 **** 9733 1726773071.60127: entering _queue_task() for managed_node3/set_fact 9733 1726773071.60296: worker is 1 (out of 1 available) 9733 1726773071.60313: exiting _queue_task() for managed_node3/set_fact 9733 1726773071.60325: done queuing things up, now waiting for results queue to drain 9733 1726773071.60326: waiting for pending results... 
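The second slurp payload, bWFudWFsCg==, decodes to "manual" plus a newline, and the check queued above uses the match test rather than search: match anchors at the beginning of the string, so it accepts "manual" but would reject a value that merely contains it elsewhere. Mirrored from the conditional shown in the next entries, with the same hypothetical failure fact as in the sketch above:

- name: Check that profile_mode is manual
  ansible.builtin.set_fact:
    __kernel_settings_test_failed: true      # hypothetical fact name
  when: "__kernel_settings_register_profile_conf_profile_mode.content | b64decode is not match('manual')"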
10405 1726773071.60438: running TaskExecutor() for managed_node3/TASK: Check that profile_mode is manual 10405 1726773071.60546: in run() - task 0affffe7-6841-7dd6-8fa6-000000000230 10405 1726773071.60563: variable 'ansible_search_path' from source: unknown 10405 1726773071.60567: variable 'ansible_search_path' from source: unknown 10405 1726773071.60597: calling self._execute() 10405 1726773071.60664: variable 'ansible_host' from source: host vars for 'managed_node3' 10405 1726773071.60672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10405 1726773071.60681: variable 'omit' from source: magic vars 10405 1726773071.61030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10405 1726773071.62802: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10405 1726773071.62848: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10405 1726773071.62876: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10405 1726773071.62905: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10405 1726773071.62927: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10405 1726773071.62978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10405 1726773071.63101: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10405 1726773071.63127: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10405 1726773071.63148: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10405 1726773071.63166: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10405 1726773071.63256: variable '__kernel_settings_register_profile_conf_profile_mode' from source: set_fact 10405 1726773071.63291: Evaluated conditional (__kernel_settings_register_profile_conf_profile_mode.content | b64decode is not match('manual')): False 10405 1726773071.63295: when evaluation is False, skipping this task 10405 1726773071.63298: _execute() done 10405 1726773071.63300: dumping result to json 10405 1726773071.63302: done dumping result, returning 10405 1726773071.63305: done running TaskExecutor() for managed_node3/TASK: Check that profile_mode is manual [0affffe7-6841-7dd6-8fa6-000000000230] 10405 1726773071.63310: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000230 10405 1726773071.63331: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000230 10405 1726773071.63333: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_register_profile_conf_profile_mode.content | b64decode is not match('manual')", "skip_reason": "Conditional result was False" } 9733 1726773071.63609: no more pending results, returning what we have 9733 1726773071.63612: results queue empty 9733 1726773071.63612: checking for any_errors_fatal 9733 1726773071.63617: done checking for any_errors_fatal 9733 1726773071.63618: checking for max_fail_percentage 9733 1726773071.63619: done checking for max_fail_percentage 9733 1726773071.63619: checking to see if all hosts have failed and the 
running result is not ok 9733 1726773071.63620: done checking to see if all hosts have failed 9733 1726773071.63620: getting the remaining hosts for this loop 9733 1726773071.63621: done getting the remaining hosts for this loop 9733 1726773071.63623: getting the next task for host managed_node3 9733 1726773071.63628: done getting next task for host managed_node3 9733 1726773071.63629: ^ task is: TASK: Get the bootloader specific config file 9733 1726773071.63632: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773071.63634: getting variables 9733 1726773071.63635: in VariableManager get_vars() 9733 1726773071.63659: Calling all_inventory to load vars for managed_node3 9733 1726773071.63661: Calling groups_inventory to load vars for managed_node3 9733 1726773071.63662: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773071.63669: Calling all_plugins_play to load vars for managed_node3 9733 1726773071.63673: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773071.63675: Calling groups_plugins_play to load vars for managed_node3 9733 1726773071.63820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773071.63934: done with get_vars() 9733 1726773071.63942: done getting variables 9733 1726773071.63987: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the bootloader specific config file] ********************************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings_conf_files.yml:74 Thursday 19 September 2024 15:11:11 -0400 (0:00:00.038) 0:00:17.372 **** 9733 1726773071.64009: entering _queue_task() for managed_node3/command 9733 1726773071.64183: worker is 1 (out of 1 available) 9733 1726773071.64200: exiting _queue_task() for managed_node3/command 9733 1726773071.64212: done queuing things up, now waiting for results queue to drain 9733 1726773071.64214: waiting for pending results... 
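The bootloader tasks that follow are guarded by __kernel_settings_blcmdline_value | d(): d() is the default filter, so an undefined variable becomes an empty string, which is falsy, and the whole bootloader branch is skipped on runs that do not set a kernel command-line expectation. The guarded read sketched below uses a placeholder command and register name, since the actual command line of the task at assert_kernel_settings_conf_files.yml:74 is not echoed in this part of the log.

- name: Get the bootloader specific config file
  ansible.builtin.command:
    cmd: cat /proc/cmdline                   # placeholder; the real task reads the bootloader-specific config file
  register: __kernel_settings_register_blcmdline    # hypothetical register name
  changed_when: false
  when: __kernel_settings_blcmdline_value | d()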
10406 1726773071.64326: running TaskExecutor() for managed_node3/TASK: Get the bootloader specific config file 10406 1726773071.64431: in run() - task 0affffe7-6841-7dd6-8fa6-000000000231 10406 1726773071.64448: variable 'ansible_search_path' from source: unknown 10406 1726773071.64452: variable 'ansible_search_path' from source: unknown 10406 1726773071.64481: calling self._execute() 10406 1726773071.64547: variable 'ansible_host' from source: host vars for 'managed_node3' 10406 1726773071.64553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10406 1726773071.64559: variable 'omit' from source: magic vars 10406 1726773071.64889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10406 1726773071.66630: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10406 1726773071.66681: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10406 1726773071.66711: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10406 1726773071.66746: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10406 1726773071.66764: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10406 1726773071.66831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10406 1726773071.66853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10406 1726773071.66872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10406 1726773071.66903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10406 1726773071.66915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10406 1726773071.67005: Evaluated conditional (__kernel_settings_blcmdline_value | d()): False 10406 1726773071.67010: when evaluation is False, skipping this task 10406 1726773071.67014: _execute() done 10406 1726773071.67017: dumping result to json 10406 1726773071.67021: done dumping result, returning 10406 1726773071.67027: done running TaskExecutor() for managed_node3/TASK: Get the bootloader specific config file [0affffe7-6841-7dd6-8fa6-000000000231] 10406 1726773071.67032: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000231 10406 1726773071.67057: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000231 10406 1726773071.67060: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_blcmdline_value | d()", "skip_reason": "Conditional result was False" } 9733 1726773071.67188: no more pending results, returning what we have 9733 1726773071.67191: results queue empty 9733 1726773071.67192: 
checking for any_errors_fatal 9733 1726773071.67197: done checking for any_errors_fatal 9733 1726773071.67198: checking for max_fail_percentage 9733 1726773071.67199: done checking for max_fail_percentage 9733 1726773071.67200: checking to see if all hosts have failed and the running result is not ok 9733 1726773071.67201: done checking to see if all hosts have failed 9733 1726773071.67201: getting the remaining hosts for this loop 9733 1726773071.67202: done getting the remaining hosts for this loop 9733 1726773071.67205: getting the next task for host managed_node3 9733 1726773071.67211: done getting next task for host managed_node3 9733 1726773071.67213: ^ task is: TASK: Verify bootloader settings value 9733 1726773071.67215: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773071.67218: getting variables 9733 1726773071.67219: in VariableManager get_vars() 9733 1726773071.67251: Calling all_inventory to load vars for managed_node3 9733 1726773071.67253: Calling groups_inventory to load vars for managed_node3 9733 1726773071.67255: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773071.67264: Calling all_plugins_play to load vars for managed_node3 9733 1726773071.67266: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773071.67268: Calling groups_plugins_play to load vars for managed_node3 9733 1726773071.67394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773071.67510: done with get_vars() 9733 1726773071.67518: done getting variables 9733 1726773071.67560: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify bootloader settings value] **************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings_conf_files.yml:85 Thursday 19 September 2024 15:11:11 -0400 (0:00:00.035) 0:00:17.408 **** 9733 1726773071.67583: entering _queue_task() for managed_node3/set_fact 9733 1726773071.67758: worker is 1 (out of 1 available) 9733 1726773071.67776: exiting _queue_task() for managed_node3/set_fact 9733 1726773071.67791: done queuing things up, now waiting for results queue to drain 9733 1726773071.67793: waiting for pending results... 
10407 1726773071.67907: running TaskExecutor() for managed_node3/TASK: Verify bootloader settings value 10407 1726773071.68016: in run() - task 0affffe7-6841-7dd6-8fa6-000000000232 10407 1726773071.68031: variable 'ansible_search_path' from source: unknown 10407 1726773071.68035: variable 'ansible_search_path' from source: unknown 10407 1726773071.68066: calling self._execute() 10407 1726773071.68133: variable 'ansible_host' from source: host vars for 'managed_node3' 10407 1726773071.68142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10407 1726773071.68150: variable 'omit' from source: magic vars 10407 1726773071.68477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10407 1726773071.70176: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10407 1726773071.70224: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10407 1726773071.70253: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10407 1726773071.70282: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10407 1726773071.70305: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10407 1726773071.70360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10407 1726773071.70384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10407 1726773071.70406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10407 1726773071.70433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10407 1726773071.70444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10407 1726773071.70534: Evaluated conditional (__kernel_settings_blcmdline_value | d()): False 10407 1726773071.70540: when evaluation is False, skipping this task 10407 1726773071.70543: _execute() done 10407 1726773071.70547: dumping result to json 10407 1726773071.70551: done dumping result, returning 10407 1726773071.70557: done running TaskExecutor() for managed_node3/TASK: Verify bootloader settings value [0affffe7-6841-7dd6-8fa6-000000000232] 10407 1726773071.70562: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000232 10407 1726773071.70588: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000232 10407 1726773071.70591: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_blcmdline_value | d()", "skip_reason": "Conditional result was False" } 9733 1726773071.70801: no more pending results, returning what we have 9733 1726773071.70804: results queue empty 9733 1726773071.70805: checking for 
any_errors_fatal 9733 1726773071.70809: done checking for any_errors_fatal 9733 1726773071.70809: checking for max_fail_percentage 9733 1726773071.70811: done checking for max_fail_percentage 9733 1726773071.70811: checking to see if all hosts have failed and the running result is not ok 9733 1726773071.70811: done checking to see if all hosts have failed 9733 1726773071.70812: getting the remaining hosts for this loop 9733 1726773071.70813: done getting the remaining hosts for this loop 9733 1726773071.70815: getting the next task for host managed_node3 9733 1726773071.70820: done getting next task for host managed_node3 9733 1726773071.70822: ^ task is: TASK: Check if kernel_settings_reboot_required is set if needed 9733 1726773071.70824: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773071.70827: getting variables 9733 1726773071.70828: in VariableManager get_vars() 9733 1726773071.70852: Calling all_inventory to load vars for managed_node3 9733 1726773071.70853: Calling groups_inventory to load vars for managed_node3 9733 1726773071.70855: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773071.70862: Calling all_plugins_play to load vars for managed_node3 9733 1726773071.70863: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773071.70865: Calling groups_plugins_play to load vars for managed_node3 9733 1726773071.71017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773071.71131: done with get_vars() 9733 1726773071.71139: done getting variables 9733 1726773071.71181: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check if kernel_settings_reboot_required is set if needed] *************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings.yml:49 Thursday 19 September 2024 15:11:11 -0400 (0:00:00.036) 0:00:17.444 **** 9733 1726773071.71204: entering _queue_task() for managed_node3/set_fact 9733 1726773071.71379: worker is 1 (out of 1 available) 9733 1726773071.71396: exiting _queue_task() for managed_node3/set_fact 9733 1726773071.71409: done queuing things up, now waiting for results queue to drain 9733 1726773071.71411: waiting for pending results... 
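The reboot check queued above is guarded (as the next entries show) by __kernel_settings_check_reboot | d(false) rather than a bare d(): supplying false as the default keeps the expression a proper boolean instead of an empty string when the variable is undefined, though either form makes a when: guard skip the task. A small illustration of the difference, not part of the test files:

- name: Illustrate the default filter behaviour    # illustrative only
  ansible.builtin.debug:
    msg:
      - "bare default: {{ some_undefined_var | d() | type_debug }}"         # empty string -> str
      - "boolean default: {{ some_undefined_var | d(false) | type_debug }}" # false -> bool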
10408 1726773071.71521: running TaskExecutor() for managed_node3/TASK: Check if kernel_settings_reboot_required is set if needed 10408 1726773071.71623: in run() - task 0affffe7-6841-7dd6-8fa6-00000000018e 10408 1726773071.71638: variable 'ansible_search_path' from source: unknown 10408 1726773071.71643: variable 'ansible_search_path' from source: unknown 10408 1726773071.71670: calling self._execute() 10408 1726773071.71737: variable 'ansible_host' from source: host vars for 'managed_node3' 10408 1726773071.71746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10408 1726773071.71754: variable 'omit' from source: magic vars 10408 1726773071.72086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10408 1726773071.73760: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10408 1726773071.73807: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10408 1726773071.73846: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10408 1726773071.73874: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10408 1726773071.73897: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10408 1726773071.73951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10408 1726773071.73974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10408 1726773071.73995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10408 1726773071.74022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10408 1726773071.74035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10408 1726773071.74121: Evaluated conditional (__kernel_settings_check_reboot | d(false)): False 10408 1726773071.74127: when evaluation is False, skipping this task 10408 1726773071.74130: _execute() done 10408 1726773071.74134: dumping result to json 10408 1726773071.74137: done dumping result, returning 10408 1726773071.74143: done running TaskExecutor() for managed_node3/TASK: Check if kernel_settings_reboot_required is set if needed [0affffe7-6841-7dd6-8fa6-00000000018e] 10408 1726773071.74149: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000018e 10408 1726773071.74171: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000018e 10408 1726773071.74175: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_check_reboot | d(false)", "skip_reason": "Conditional result was False" } 9733 1726773071.74291: no more pending results, returning what we have 9733 1726773071.74295: 
results queue empty 9733 1726773071.74295: checking for any_errors_fatal 9733 1726773071.74301: done checking for any_errors_fatal 9733 1726773071.74301: checking for max_fail_percentage 9733 1726773071.74302: done checking for max_fail_percentage 9733 1726773071.74303: checking to see if all hosts have failed and the running result is not ok 9733 1726773071.74304: done checking to see if all hosts have failed 9733 1726773071.74304: getting the remaining hosts for this loop 9733 1726773071.74305: done getting the remaining hosts for this loop 9733 1726773071.74308: getting the next task for host managed_node3 9733 1726773071.74313: done getting next task for host managed_node3 9733 1726773071.74315: ^ task is: TASK: Assert success 9733 1726773071.74317: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773071.74320: getting variables 9733 1726773071.74321: in VariableManager get_vars() 9733 1726773071.74351: Calling all_inventory to load vars for managed_node3 9733 1726773071.74354: Calling groups_inventory to load vars for managed_node3 9733 1726773071.74355: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773071.74364: Calling all_plugins_play to load vars for managed_node3 9733 1726773071.74366: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773071.74369: Calling groups_plugins_play to load vars for managed_node3 9733 1726773071.74496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773071.74611: done with get_vars() 9733 1726773071.74619: done getting variables 9733 1726773071.74690: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Assert success] ********************************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings.yml:56 Thursday 19 September 2024 15:11:11 -0400 (0:00:00.035) 0:00:17.479 **** 9733 1726773071.74710: entering _queue_task() for managed_node3/assert 9733 1726773071.74711: Creating lock for assert 9733 1726773071.74887: worker is 1 (out of 1 available) 9733 1726773071.74903: exiting _queue_task() for managed_node3/assert 9733 1726773071.74913: done queuing things up, now waiting for results queue to drain 9733 1726773071.74915: waiting for pending results... 
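
The same default-filter pattern guards the reboot-required check that was just skipped, and the Assert success task queued next passes because __kernel_settings_success was set earlier via set_fact. A sketch of both patterns, with task names and guard expressions from the log and the set_fact body assumed for illustration:

    - name: Check if kernel_settings_reboot_required is set if needed
      # Body assumed; only the guard below is visible in the trace.
      set_fact:
        __kernel_settings_reboot_flag_checked: true
      when: __kernel_settings_check_reboot | d(false)

    - name: Assert success
      assert:
        that:
          - __kernel_settings_success | d(false)
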
10409 1726773071.75028: running TaskExecutor() for managed_node3/TASK: Assert success 10409 1726773071.75126: in run() - task 0affffe7-6841-7dd6-8fa6-00000000018f 10409 1726773071.75144: variable 'ansible_search_path' from source: unknown 10409 1726773071.75148: variable 'ansible_search_path' from source: unknown 10409 1726773071.75177: calling self._execute() 10409 1726773071.75242: variable 'ansible_host' from source: host vars for 'managed_node3' 10409 1726773071.75251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10409 1726773071.75259: variable 'omit' from source: magic vars 10409 1726773071.75332: variable 'omit' from source: magic vars 10409 1726773071.75358: variable 'omit' from source: magic vars 10409 1726773071.75379: variable 'omit' from source: magic vars 10409 1726773071.75411: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10409 1726773071.75437: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10409 1726773071.75454: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10409 1726773071.75466: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10409 1726773071.75476: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10409 1726773071.75504: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10409 1726773071.75510: variable 'ansible_host' from source: host vars for 'managed_node3' 10409 1726773071.75515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10409 1726773071.75587: Set connection var ansible_timeout to 10 10409 1726773071.75592: Set connection var ansible_shell_type to sh 10409 1726773071.75598: Set connection var ansible_module_compression to ZIP_DEFLATED 10409 1726773071.75604: Set connection var ansible_shell_executable to /bin/sh 10409 1726773071.75610: Set connection var ansible_pipelining to False 10409 1726773071.75616: Set connection var ansible_connection to ssh 10409 1726773071.75631: variable 'ansible_shell_executable' from source: unknown 10409 1726773071.75635: variable 'ansible_connection' from source: unknown 10409 1726773071.75639: variable 'ansible_module_compression' from source: unknown 10409 1726773071.75643: variable 'ansible_shell_type' from source: unknown 10409 1726773071.75646: variable 'ansible_shell_executable' from source: unknown 10409 1726773071.75647: variable 'ansible_host' from source: host vars for 'managed_node3' 10409 1726773071.75650: variable 'ansible_pipelining' from source: unknown 10409 1726773071.75652: variable 'ansible_timeout' from source: unknown 10409 1726773071.75654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10409 1726773071.75744: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10409 1726773071.75754: variable 'omit' from source: magic vars 10409 1726773071.75760: starting attempt loop 10409 1726773071.75762: running the handler 10409 1726773071.76267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to 
reserved name 10409 1726773071.77769: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10409 1726773071.77818: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10409 1726773071.77847: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10409 1726773071.77872: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10409 1726773071.77903: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10409 1726773071.77953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10409 1726773071.77976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10409 1726773071.77997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10409 1726773071.78023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10409 1726773071.78036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10409 1726773071.78113: variable '__kernel_settings_success' from source: set_fact 10409 1726773071.78129: Evaluated conditional (__kernel_settings_success | d(false)): True 10409 1726773071.78137: handler run complete 10409 1726773071.78153: attempt loop complete, returning result 10409 1726773071.78157: _execute() done 10409 1726773071.78160: dumping result to json 10409 1726773071.78164: done dumping result, returning 10409 1726773071.78171: done running TaskExecutor() for managed_node3/TASK: Assert success [0affffe7-6841-7dd6-8fa6-00000000018f] 10409 1726773071.78177: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000018f 10409 1726773071.78203: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000018f 10409 1726773071.78206: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 9733 1726773071.78325: no more pending results, returning what we have 9733 1726773071.78328: results queue empty 9733 1726773071.78329: checking for any_errors_fatal 9733 1726773071.78333: done checking for any_errors_fatal 9733 1726773071.78334: checking for max_fail_percentage 9733 1726773071.78335: done checking for max_fail_percentage 9733 1726773071.78335: checking to see if all hosts have failed and the running result is not ok 9733 1726773071.78336: done checking to see if all hosts have failed 9733 1726773071.78336: getting the remaining hosts for this loop 9733 1726773071.78338: done getting the remaining hosts for this loop 9733 1726773071.78341: getting the next task for host managed_node3 9733 1726773071.78348: done getting next task for host managed_node3 9733 1726773071.78351: ^ task is: TASK: Check ansible_managed, fingerprint in generated files 9733 
1726773071.78352: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773071.78356: getting variables 9733 1726773071.78357: in VariableManager get_vars() 9733 1726773071.78391: Calling all_inventory to load vars for managed_node3 9733 1726773071.78394: Calling groups_inventory to load vars for managed_node3 9733 1726773071.78396: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773071.78405: Calling all_plugins_play to load vars for managed_node3 9733 1726773071.78412: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773071.78415: Calling groups_plugins_play to load vars for managed_node3 9733 1726773071.78737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773071.78845: done with get_vars() 9733 1726773071.78852: done getting variables TASK [Check ansible_managed, fingerprint in generated files] ******************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_simple_settings.yml:35 Thursday 19 September 2024 15:11:11 -0400 (0:00:00.042) 0:00:17.521 **** 9733 1726773071.78917: entering _queue_task() for managed_node3/include_tasks 9733 1726773071.79082: worker is 1 (out of 1 available) 9733 1726773071.79099: exiting _queue_task() for managed_node3/include_tasks 9733 1726773071.79111: done queuing things up, now waiting for results queue to drain 9733 1726773071.79112: waiting for pending results... 10410 1726773071.79227: running TaskExecutor() for managed_node3/TASK: Check ansible_managed, fingerprint in generated files 10410 1726773071.79324: in run() - task 0affffe7-6841-7dd6-8fa6-00000000000a 10410 1726773071.79339: variable 'ansible_search_path' from source: unknown 10410 1726773071.79368: calling self._execute() 10410 1726773071.79438: variable 'ansible_host' from source: host vars for 'managed_node3' 10410 1726773071.79447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10410 1726773071.79455: variable 'omit' from source: magic vars 10410 1726773071.79528: _execute() done 10410 1726773071.79532: dumping result to json 10410 1726773071.79535: done dumping result, returning 10410 1726773071.79538: done running TaskExecutor() for managed_node3/TASK: Check ansible_managed, fingerprint in generated files [0affffe7-6841-7dd6-8fa6-00000000000a] 10410 1726773071.79545: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000000a 10410 1726773071.79568: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000000a 10410 1726773071.79571: WORKER PROCESS EXITING 9733 1726773071.79697: no more pending results, returning what we have 9733 1726773071.79701: in VariableManager get_vars() 9733 1726773071.79732: Calling all_inventory to load vars for managed_node3 9733 1726773071.79734: Calling groups_inventory to load vars for managed_node3 9733 1726773071.79736: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773071.79744: Calling all_plugins_play to load vars for managed_node3 9733 1726773071.79745: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773071.79747: Calling groups_plugins_play to load vars for managed_node3 9733 1726773071.79851: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773071.79967: done with get_vars() 9733 1726773071.79974: variable 'ansible_search_path' from source: unknown 9733 1726773071.79983: we have included files to process 9733 1726773071.79984: generating all_blocks data 9733 1726773071.79986: done generating all_blocks data 9733 1726773071.79990: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/check_header.yml 9733 1726773071.79990: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/check_header.yml 9733 1726773071.79992: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/check_header.yml included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/check_header.yml for managed_node3 9733 1726773071.80216: done processing included file 9733 1726773071.80218: iterating over new_blocks loaded from include file 9733 1726773071.80219: in VariableManager get_vars() 9733 1726773071.80230: done with get_vars() 9733 1726773071.80231: filtering new block on tags 9733 1726773071.80249: done filtering new block on tags 9733 1726773071.80251: done iterating over new_blocks loaded from include file 9733 1726773071.80251: extending task lists for all hosts with included blocks 9733 1726773071.81125: done extending task lists 9733 1726773071.81126: done processing included files 9733 1726773071.81126: results queue empty 9733 1726773071.81127: checking for any_errors_fatal 9733 1726773071.81129: done checking for any_errors_fatal 9733 1726773071.81130: checking for max_fail_percentage 9733 1726773071.81130: done checking for max_fail_percentage 9733 1726773071.81131: checking to see if all hosts have failed and the running result is not ok 9733 1726773071.81131: done checking to see if all hosts have failed 9733 1726773071.81132: getting the remaining hosts for this loop 9733 1726773071.81132: done getting the remaining hosts for this loop 9733 1726773071.81133: getting the next task for host managed_node3 9733 1726773071.81136: done getting next task for host managed_node3 9733 1726773071.81137: ^ task is: TASK: Get file 9733 1726773071.81139: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773071.81140: getting variables 9733 1726773071.81141: in VariableManager get_vars() 9733 1726773071.81149: Calling all_inventory to load vars for managed_node3 9733 1726773071.81150: Calling groups_inventory to load vars for managed_node3 9733 1726773071.81151: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773071.81155: Calling all_plugins_play to load vars for managed_node3 9733 1726773071.81157: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773071.81158: Calling groups_plugins_play to load vars for managed_node3 9733 1726773071.81254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773071.81363: done with get_vars() 9733 1726773071.81373: done getting variables TASK [Get file] **************************************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/check_header.yml:3 Thursday 19 September 2024 15:11:11 -0400 (0:00:00.025) 0:00:17.546 **** 9733 1726773071.81421: entering _queue_task() for managed_node3/slurp 9733 1726773071.81609: worker is 1 (out of 1 available) 9733 1726773071.81623: exiting _queue_task() for managed_node3/slurp 9733 1726773071.81634: done queuing things up, now waiting for results queue to drain 9733 1726773071.81636: waiting for pending results... 10411 1726773071.81749: running TaskExecutor() for managed_node3/TASK: Get file 10411 1726773071.81848: in run() - task 0affffe7-6841-7dd6-8fa6-00000000028a 10411 1726773071.81863: variable 'ansible_search_path' from source: unknown 10411 1726773071.81867: variable 'ansible_search_path' from source: unknown 10411 1726773071.81898: calling self._execute() 10411 1726773071.81963: variable 'ansible_host' from source: host vars for 'managed_node3' 10411 1726773071.81972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10411 1726773071.81980: variable 'omit' from source: magic vars 10411 1726773071.82303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10411 1726773071.82481: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10411 1726773071.82517: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10411 1726773071.82543: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10411 1726773071.82571: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10411 1726773071.82632: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10411 1726773071.82650: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10411 1726773071.82665: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10411 1726773071.82681: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10411 1726773071.82784: 
Evaluated conditional (not __file_content is defined): True 10411 1726773071.82793: variable 'omit' from source: magic vars 10411 1726773071.82820: variable 'omit' from source: magic vars 10411 1726773071.82838: variable '__file' from source: include params 10411 1726773071.82896: variable '__file' from source: include params 10411 1726773071.82904: variable '__kernel_settings_profile_filename' from source: role '' exported vars 10411 1726773071.82947: variable '__kernel_settings_profile_filename' from source: role '' exported vars 10411 1726773071.82998: variable '__kernel_settings_profile_dir' from source: role '' exported vars 10411 1726773071.83053: variable '__kernel_settings_profile_parent' from source: set_fact 10411 1726773071.83062: variable '__kernel_settings_tuned_profile' from source: role '' exported vars 10411 1726773071.83116: variable 'omit' from source: magic vars 10411 1726773071.83136: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10411 1726773071.83157: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10411 1726773071.83174: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10411 1726773071.83190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10411 1726773071.83201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10411 1726773071.83223: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10411 1726773071.83228: variable 'ansible_host' from source: host vars for 'managed_node3' 10411 1726773071.83232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10411 1726773071.83299: Set connection var ansible_timeout to 10 10411 1726773071.83304: Set connection var ansible_shell_type to sh 10411 1726773071.83310: Set connection var ansible_module_compression to ZIP_DEFLATED 10411 1726773071.83315: Set connection var ansible_shell_executable to /bin/sh 10411 1726773071.83321: Set connection var ansible_pipelining to False 10411 1726773071.83327: Set connection var ansible_connection to ssh 10411 1726773071.83342: variable 'ansible_shell_executable' from source: unknown 10411 1726773071.83345: variable 'ansible_connection' from source: unknown 10411 1726773071.83348: variable 'ansible_module_compression' from source: unknown 10411 1726773071.83352: variable 'ansible_shell_type' from source: unknown 10411 1726773071.83355: variable 'ansible_shell_executable' from source: unknown 10411 1726773071.83359: variable 'ansible_host' from source: host vars for 'managed_node3' 10411 1726773071.83363: variable 'ansible_pipelining' from source: unknown 10411 1726773071.83366: variable 'ansible_timeout' from source: unknown 10411 1726773071.83370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10411 1726773071.83458: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10411 1726773071.83469: variable 'omit' from source: magic vars 10411 1726773071.83476: starting attempt loop 10411 1726773071.83480: running the handler 10411 1726773071.83493: _low_level_execute_command(): starting 10411 
1726773071.83501: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10411 1726773071.85856: stdout chunk (state=2): >>>/root <<< 10411 1726773071.85988: stderr chunk (state=3): >>><<< 10411 1726773071.85995: stdout chunk (state=3): >>><<< 10411 1726773071.86012: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10411 1726773071.86024: _low_level_execute_command(): starting 10411 1726773071.86030: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773071.8601942-10411-91795833323844 `" && echo ansible-tmp-1726773071.8601942-10411-91795833323844="` echo /root/.ansible/tmp/ansible-tmp-1726773071.8601942-10411-91795833323844 `" ) && sleep 0' 10411 1726773071.88505: stdout chunk (state=2): >>>ansible-tmp-1726773071.8601942-10411-91795833323844=/root/.ansible/tmp/ansible-tmp-1726773071.8601942-10411-91795833323844 <<< 10411 1726773071.88642: stderr chunk (state=3): >>><<< 10411 1726773071.88650: stdout chunk (state=3): >>><<< 10411 1726773071.88665: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773071.8601942-10411-91795833323844=/root/.ansible/tmp/ansible-tmp-1726773071.8601942-10411-91795833323844 , stderr= 10411 1726773071.88705: variable 'ansible_module_compression' from source: unknown 10411 1726773071.88740: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.slurp-ZIP_DEFLATED 10411 1726773071.88771: variable 'ansible_facts' from source: unknown 10411 1726773071.88843: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773071.8601942-10411-91795833323844/AnsiballZ_slurp.py 10411 1726773071.88947: Sending initial data 10411 1726773071.88954: Sent initial data (152 bytes) 10411 1726773071.91482: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpi5z55gk8 /root/.ansible/tmp/ansible-tmp-1726773071.8601942-10411-91795833323844/AnsiballZ_slurp.py <<< 10411 1726773071.92623: stderr chunk (state=3): >>><<< 10411 1726773071.92633: stdout chunk (state=3): >>><<< 10411 1726773071.92653: done transferring module to remote 10411 1726773071.92662: _low_level_execute_command(): starting 10411 1726773071.92666: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773071.8601942-10411-91795833323844/ /root/.ansible/tmp/ansible-tmp-1726773071.8601942-10411-91795833323844/AnsiballZ_slurp.py && sleep 0' 10411 1726773071.95022: stderr chunk (state=2): >>><<< 10411 1726773071.95029: stdout chunk (state=2): >>><<< 10411 1726773071.95043: _low_level_execute_command() done: rc=0, stdout=, stderr= 10411 1726773071.95047: _low_level_execute_command(): starting 10411 1726773071.95052: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773071.8601942-10411-91795833323844/AnsiballZ_slurp.py && sleep 0' 10411 1726773072.09788: stdout chunk (state=2): >>> {"content": "IwojIEFuc2libGUgbWFuYWdlZAojCiMgc3lzdGVtX3JvbGU6a2VybmVsX3NldHRpbmdzCgpbbWFpbl0Kc3VtbWFyeSA9IGtlcm5lbCBzZXR0aW5ncwpbc3lzY3RsXQpmcy5lcG9sbC5tYXhfdXNlcl93YXRjaGVzID0gNzg1NTkyCmZzLmZpbGUtbWF4ID0gMzc5NzI0CltzeXNmc10KL3N5cy9rZXJuZWwvZGVidWcveDg2L2licnNfZW5hYmxlZCA9IDAKL3N5cy9rZXJuZWwvZGVidWcveDg2L3B0aV9lbmFibGVkID0gMAovc3lzL2tlcm5lbC9kZWJ1Zy94ODYvcmV0cF9lbmFibGVkID0gMApbdm1dCnRyYW5zcGFyZW50X2h1Z2VwYWdlcyA9IG1hZHZpc2UK", "source": "/etc/tuned/kernel_settings/tuned.conf", "encoding": 
"base64", "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "src": "/etc/tuned/kernel_settings/tuned.conf"}}} <<< 10411 1726773072.10780: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10411 1726773072.10827: stderr chunk (state=3): >>><<< 10411 1726773072.10834: stdout chunk (state=3): >>><<< 10411 1726773072.10851: _low_level_execute_command() done: rc=0, stdout= {"content": "IwojIEFuc2libGUgbWFuYWdlZAojCiMgc3lzdGVtX3JvbGU6a2VybmVsX3NldHRpbmdzCgpbbWFpbl0Kc3VtbWFyeSA9IGtlcm5lbCBzZXR0aW5ncwpbc3lzY3RsXQpmcy5lcG9sbC5tYXhfdXNlcl93YXRjaGVzID0gNzg1NTkyCmZzLmZpbGUtbWF4ID0gMzc5NzI0CltzeXNmc10KL3N5cy9rZXJuZWwvZGVidWcveDg2L2licnNfZW5hYmxlZCA9IDAKL3N5cy9rZXJuZWwvZGVidWcveDg2L3B0aV9lbmFibGVkID0gMAovc3lzL2tlcm5lbC9kZWJ1Zy94ODYvcmV0cF9lbmFibGVkID0gMApbdm1dCnRyYW5zcGFyZW50X2h1Z2VwYWdlcyA9IG1hZHZpc2UK", "source": "/etc/tuned/kernel_settings/tuned.conf", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "src": "/etc/tuned/kernel_settings/tuned.conf"}}} , stderr=Shared connection to 10.31.47.99 closed. 10411 1726773072.10874: done with _execute_module (slurp, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773071.8601942-10411-91795833323844/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10411 1726773072.10887: _low_level_execute_command(): starting 10411 1726773072.10894: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773071.8601942-10411-91795833323844/ > /dev/null 2>&1 && sleep 0' 10411 1726773072.13342: stderr chunk (state=2): >>><<< 10411 1726773072.13354: stdout chunk (state=2): >>><<< 10411 1726773072.13372: _low_level_execute_command() done: rc=0, stdout=, stderr= 10411 1726773072.13380: handler run complete 10411 1726773072.13395: attempt loop complete, returning result 10411 1726773072.13398: _execute() done 10411 1726773072.13403: dumping result to json 10411 1726773072.13407: done dumping result, returning 10411 1726773072.13414: done running TaskExecutor() for managed_node3/TASK: Get file [0affffe7-6841-7dd6-8fa6-00000000028a] 10411 1726773072.13420: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000028a 10411 1726773072.13447: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000028a 10411 1726773072.13450: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "content": "IwojIEFuc2libGUgbWFuYWdlZAojCiMgc3lzdGVtX3JvbGU6a2VybmVsX3NldHRpbmdzCgpbbWFpbl0Kc3VtbWFyeSA9IGtlcm5lbCBzZXR0aW5ncwpbc3lzY3RsXQpmcy5lcG9sbC5tYXhfdXNlcl93YXRjaGVzID0gNzg1NTkyCmZzLmZpbGUtbWF4ID0gMzc5NzI0CltzeXNmc10KL3N5cy9rZXJuZWwvZGVidWcveDg2L2licnNfZW5hYmxlZCA9IDAKL3N5cy9rZXJuZWwvZGVidWcveDg2L3B0aV9lbmFibGVkID0gMAovc3lzL2tlcm5lbC9kZWJ1Zy94ODYvcmV0cF9lbmFibGVkID0gMApbdm1dCnRyYW5zcGFyZW50X2h1Z2VwYWdlcyA9IG1hZHZpc2UK", "encoding": "base64", "source": "/etc/tuned/kernel_settings/tuned.conf" } 9733 1726773072.13575: no more pending results, returning what we have 9733 1726773072.13578: results queue empty 9733 1726773072.13578: 
checking for any_errors_fatal 9733 1726773072.13580: done checking for any_errors_fatal 9733 1726773072.13580: checking for max_fail_percentage 9733 1726773072.13582: done checking for max_fail_percentage 9733 1726773072.13582: checking to see if all hosts have failed and the running result is not ok 9733 1726773072.13583: done checking to see if all hosts have failed 9733 1726773072.13583: getting the remaining hosts for this loop 9733 1726773072.13584: done getting the remaining hosts for this loop 9733 1726773072.13589: getting the next task for host managed_node3 9733 1726773072.13594: done getting next task for host managed_node3 9733 1726773072.13596: ^ task is: TASK: Check for presence of ansible managed header, fingerprint 9733 1726773072.13598: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773072.13601: getting variables 9733 1726773072.13602: in VariableManager get_vars() 9733 1726773072.13633: Calling all_inventory to load vars for managed_node3 9733 1726773072.13635: Calling groups_inventory to load vars for managed_node3 9733 1726773072.13637: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773072.13647: Calling all_plugins_play to load vars for managed_node3 9733 1726773072.13649: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773072.13652: Calling groups_plugins_play to load vars for managed_node3 9733 1726773072.13789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773072.13913: done with get_vars() 9733 1726773072.13921: done getting variables 9733 1726773072.13964: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check for presence of ansible managed header, fingerprint] *************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/check_header.yml:9 Thursday 19 September 2024 15:11:12 -0400 (0:00:00.325) 0:00:17.872 **** 9733 1726773072.13984: entering _queue_task() for managed_node3/assert 9733 1726773072.14157: worker is 1 (out of 1 available) 9733 1726773072.14172: exiting _queue_task() for managed_node3/assert 9733 1726773072.14183: done queuing things up, now waiting for results queue to drain 9733 1726773072.14188: waiting for pending results... 
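
The Get file task traced above uses the slurp module, which returns the remote file body base64-encoded; the test then decodes it into __content for the header assertions that follow. A sketch of that fetch-and-decode step, assuming the register/variable names seen in the trace (__file_content, __content); the path is hard-coded here for illustration, whereas the real task assembles it from role variables (__file, __kernel_settings_profile_dir) as the variable lines show.

    - name: Get file
      slurp:
        path: /etc/tuned/kernel_settings/tuned.conf
      register: __file_content
      # Matches the conditional evaluated in the log: only slurp the file if
      # the content has not already been captured.
      when: not __file_content is defined

    - name: Decode the slurped content for the header checks
      set_fact:
        __content: "{{ __file_content.content | b64decode }}"
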
10422 1726773072.14307: running TaskExecutor() for managed_node3/TASK: Check for presence of ansible managed header, fingerprint 10422 1726773072.14411: in run() - task 0affffe7-6841-7dd6-8fa6-00000000028b 10422 1726773072.14427: variable 'ansible_search_path' from source: unknown 10422 1726773072.14432: variable 'ansible_search_path' from source: unknown 10422 1726773072.14459: calling self._execute() 10422 1726773072.14534: variable 'ansible_host' from source: host vars for 'managed_node3' 10422 1726773072.14543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10422 1726773072.14552: variable 'omit' from source: magic vars 10422 1726773072.14629: variable 'omit' from source: magic vars 10422 1726773072.14659: variable 'omit' from source: magic vars 10422 1726773072.14684: variable 'omit' from source: magic vars 10422 1726773072.14719: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10422 1726773072.14748: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10422 1726773072.14765: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10422 1726773072.14779: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10422 1726773072.14791: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10422 1726773072.14813: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10422 1726773072.14816: variable 'ansible_host' from source: host vars for 'managed_node3' 10422 1726773072.14819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10422 1726773072.14887: Set connection var ansible_timeout to 10 10422 1726773072.14891: Set connection var ansible_shell_type to sh 10422 1726773072.14895: Set connection var ansible_module_compression to ZIP_DEFLATED 10422 1726773072.14898: Set connection var ansible_shell_executable to /bin/sh 10422 1726773072.14901: Set connection var ansible_pipelining to False 10422 1726773072.14905: Set connection var ansible_connection to ssh 10422 1726773072.14917: variable 'ansible_shell_executable' from source: unknown 10422 1726773072.14920: variable 'ansible_connection' from source: unknown 10422 1726773072.14922: variable 'ansible_module_compression' from source: unknown 10422 1726773072.14924: variable 'ansible_shell_type' from source: unknown 10422 1726773072.14926: variable 'ansible_shell_executable' from source: unknown 10422 1726773072.14927: variable 'ansible_host' from source: host vars for 'managed_node3' 10422 1726773072.14929: variable 'ansible_pipelining' from source: unknown 10422 1726773072.14931: variable 'ansible_timeout' from source: unknown 10422 1726773072.14933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10422 1726773072.15025: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10422 1726773072.15035: variable 'omit' from source: magic vars 10422 1726773072.15040: starting attempt loop 10422 1726773072.15042: running the handler 10422 1726773072.15373: variable 'ansible_managed' from source: task vars 
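
The assert whose execution begins here checks the decoded file content for the ansible_managed banner (rendered through the get_ansible_managed.j2 template lookup recorded below) and for the role fingerprint passed in as __fingerprint. A sketch of that check using the variable names from the trace; the template contents and the fingerprint value come from the test suite and are not shown in this log.

    - name: Check for presence of ansible managed header, fingerprint
      assert:
        that:
          - ansible_managed in content
          - __fingerprint in content
      vars:
        content: "{{ __content }}"
        # Rendered via the template lookup seen in the trace.
        ansible_managed: "{{ lookup('template', 'get_ansible_managed.j2') }}"

Both conditionals evaluate True in the trace that follows, which is why the task reports "All assertions passed".
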
10422 1726773072.15531: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 10422 1726773072.15688: Loaded config def from plugin (lookup/template) 10422 1726773072.15695: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 10422 1726773072.15713: File lookup term: get_ansible_managed.j2 10422 1726773072.15717: variable 'ansible_search_path' from source: unknown 10422 1726773072.15722: evaluation_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings 10422 1726773072.15747: search_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/templates/get_ansible_managed.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/get_ansible_managed.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/get_ansible_managed.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/get_ansible_managed.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/get_ansible_managed.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/get_ansible_managed.j2 10422 1726773072.15766: variable 'ansible_search_path' from source: unknown 10422 1726773072.16441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10422 1726773072.17805: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10422 1726773072.17856: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10422 1726773072.17888: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10422 1726773072.17914: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10422 1726773072.17934: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10422 1726773072.17989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10422 1726773072.18010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10422 1726773072.18028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10422 1726773072.18055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10422 1726773072.18066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10422 1726773072.18143: variable 'ansible_managed' from source: task vars 10422 1726773072.18183: variable 'content' from source: task 
vars 10422 1726773072.18280: variable '__content' from source: set_fact 10422 1726773072.18302: Evaluated conditional (ansible_managed in content): True 10422 1726773072.18377: variable '__fingerprint' from source: include params 10422 1726773072.18383: variable 'content' from source: task vars 10422 1726773072.18477: variable '__content' from source: set_fact 10422 1726773072.18493: Evaluated conditional (__fingerprint in content): True 10422 1726773072.18500: handler run complete 10422 1726773072.18511: attempt loop complete, returning result 10422 1726773072.18515: _execute() done 10422 1726773072.18518: dumping result to json 10422 1726773072.18522: done dumping result, returning 10422 1726773072.18528: done running TaskExecutor() for managed_node3/TASK: Check for presence of ansible managed header, fingerprint [0affffe7-6841-7dd6-8fa6-00000000028b] 10422 1726773072.18534: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000028b 10422 1726773072.18557: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000028b 10422 1726773072.18561: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 9733 1726773072.18681: no more pending results, returning what we have 9733 1726773072.18684: results queue empty 9733 1726773072.18684: checking for any_errors_fatal 9733 1726773072.18693: done checking for any_errors_fatal 9733 1726773072.18694: checking for max_fail_percentage 9733 1726773072.18695: done checking for max_fail_percentage 9733 1726773072.18696: checking to see if all hosts have failed and the running result is not ok 9733 1726773072.18696: done checking to see if all hosts have failed 9733 1726773072.18697: getting the remaining hosts for this loop 9733 1726773072.18698: done getting the remaining hosts for this loop 9733 1726773072.18701: getting the next task for host managed_node3 9733 1726773072.18707: done getting next task for host managed_node3 9733 1726773072.18709: ^ task is: TASK: Ensure role reported changed 9733 1726773072.18710: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773072.18713: getting variables 9733 1726773072.18714: in VariableManager get_vars() 9733 1726773072.18745: Calling all_inventory to load vars for managed_node3 9733 1726773072.18747: Calling groups_inventory to load vars for managed_node3 9733 1726773072.18749: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773072.18758: Calling all_plugins_play to load vars for managed_node3 9733 1726773072.18760: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773072.18763: Calling groups_plugins_play to load vars for managed_node3 9733 1726773072.18922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773072.19031: done with get_vars() 9733 1726773072.19040: done getting variables 9733 1726773072.19083: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure role reported changed] ******************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_simple_settings.yml:41 Thursday 19 September 2024 15:11:12 -0400 (0:00:00.051) 0:00:17.923 **** 9733 1726773072.19103: entering _queue_task() for managed_node3/assert 9733 1726773072.19267: worker is 1 (out of 1 available) 9733 1726773072.19283: exiting _queue_task() for managed_node3/assert 9733 1726773072.19296: done queuing things up, now waiting for results queue to drain 9733 1726773072.19298: waiting for pending results... 10423 1726773072.19411: running TaskExecutor() for managed_node3/TASK: Ensure role reported changed 10423 1726773072.19500: in run() - task 0affffe7-6841-7dd6-8fa6-00000000000b 10423 1726773072.19515: variable 'ansible_search_path' from source: unknown 10423 1726773072.19545: calling self._execute() 10423 1726773072.19616: variable 'ansible_host' from source: host vars for 'managed_node3' 10423 1726773072.19624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10423 1726773072.19631: variable 'omit' from source: magic vars 10423 1726773072.19702: variable 'omit' from source: magic vars 10423 1726773072.19725: variable 'omit' from source: magic vars 10423 1726773072.19746: variable 'omit' from source: magic vars 10423 1726773072.19775: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10423 1726773072.19807: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10423 1726773072.19830: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10423 1726773072.19845: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10423 1726773072.19857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10423 1726773072.19881: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10423 1726773072.19887: variable 'ansible_host' from source: host vars for 'managed_node3' 10423 1726773072.19892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10423 1726773072.19961: Set connection var ansible_timeout to 10 10423 
1726773072.19966: Set connection var ansible_shell_type to sh 10423 1726773072.19972: Set connection var ansible_module_compression to ZIP_DEFLATED 10423 1726773072.19977: Set connection var ansible_shell_executable to /bin/sh 10423 1726773072.19983: Set connection var ansible_pipelining to False 10423 1726773072.19990: Set connection var ansible_connection to ssh 10423 1726773072.20005: variable 'ansible_shell_executable' from source: unknown 10423 1726773072.20009: variable 'ansible_connection' from source: unknown 10423 1726773072.20013: variable 'ansible_module_compression' from source: unknown 10423 1726773072.20016: variable 'ansible_shell_type' from source: unknown 10423 1726773072.20020: variable 'ansible_shell_executable' from source: unknown 10423 1726773072.20023: variable 'ansible_host' from source: host vars for 'managed_node3' 10423 1726773072.20028: variable 'ansible_pipelining' from source: unknown 10423 1726773072.20031: variable 'ansible_timeout' from source: unknown 10423 1726773072.20035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10423 1726773072.20126: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10423 1726773072.20137: variable 'omit' from source: magic vars 10423 1726773072.20144: starting attempt loop 10423 1726773072.20147: running the handler 10423 1726773072.20394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10423 1726773072.21954: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10423 1726773072.22003: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10423 1726773072.22031: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10423 1726773072.22059: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10423 1726773072.22080: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10423 1726773072.22130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10423 1726773072.22150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10423 1726773072.22169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10423 1726773072.22200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10423 1726773072.22213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10423 1726773072.22289: variable 
'__kernel_settings_changed' from source: set_fact 10423 1726773072.22305: Evaluated conditional (__kernel_settings_changed | d(false)): True 10423 1726773072.22312: handler run complete 10423 1726773072.22328: attempt loop complete, returning result 10423 1726773072.22332: _execute() done 10423 1726773072.22336: dumping result to json 10423 1726773072.22339: done dumping result, returning 10423 1726773072.22347: done running TaskExecutor() for managed_node3/TASK: Ensure role reported changed [0affffe7-6841-7dd6-8fa6-00000000000b] 10423 1726773072.22352: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000000b 10423 1726773072.22375: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000000b 10423 1726773072.22379: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 9733 1726773072.22486: no more pending results, returning what we have 9733 1726773072.22489: results queue empty 9733 1726773072.22490: checking for any_errors_fatal 9733 1726773072.22495: done checking for any_errors_fatal 9733 1726773072.22496: checking for max_fail_percentage 9733 1726773072.22498: done checking for max_fail_percentage 9733 1726773072.22498: checking to see if all hosts have failed and the running result is not ok 9733 1726773072.22499: done checking to see if all hosts have failed 9733 1726773072.22499: getting the remaining hosts for this loop 9733 1726773072.22500: done getting the remaining hosts for this loop 9733 1726773072.22503: getting the next task for host managed_node3 9733 1726773072.22507: done getting next task for host managed_node3 9733 1726773072.22509: ^ task is: TASK: Reboot the machine - see if settings persist after reboot 9733 1726773072.22511: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773072.22513: getting variables 9733 1726773072.22514: in VariableManager get_vars() 9733 1726773072.22543: Calling all_inventory to load vars for managed_node3 9733 1726773072.22545: Calling groups_inventory to load vars for managed_node3 9733 1726773072.22547: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773072.22557: Calling all_plugins_play to load vars for managed_node3 9733 1726773072.22560: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773072.22568: Calling groups_plugins_play to load vars for managed_node3 9733 1726773072.22694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773072.22814: done with get_vars() 9733 1726773072.22822: done getting variables 9733 1726773072.22861: Loading ActionModule 'reboot' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Reboot the machine - see if settings persist after reboot] *************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_simple_settings.yml:47 Thursday 19 September 2024 15:11:12 -0400 (0:00:00.037) 0:00:17.961 **** 9733 1726773072.22883: entering _queue_task() for managed_node3/reboot 9733 1726773072.23044: worker is 1 (out of 1 available) 9733 1726773072.23057: exiting _queue_task() for managed_node3/reboot 9733 1726773072.23068: done queuing things up, now waiting for results queue to drain 9733 1726773072.23072: waiting for pending results... 10424 1726773072.23190: running TaskExecutor() for managed_node3/TASK: Reboot the machine - see if settings persist after reboot 10424 1726773072.23279: in run() - task 0affffe7-6841-7dd6-8fa6-00000000000c 10424 1726773072.23294: variable 'ansible_search_path' from source: unknown 10424 1726773072.23323: calling self._execute() 10424 1726773072.23391: variable 'ansible_host' from source: host vars for 'managed_node3' 10424 1726773072.23398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10424 1726773072.23404: variable 'omit' from source: magic vars 10424 1726773072.23789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10424 1726773072.25235: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10424 1726773072.25283: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10424 1726773072.25313: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10424 1726773072.25340: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10424 1726773072.25370: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10424 1726773072.25426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10424 1726773072.25448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 10424 1726773072.25467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10424 1726773072.25499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10424 1726773072.25510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10424 1726773072.25590: variable 'kernel_settings_reboot_required' from source: set_fact 10424 1726773072.25606: Evaluated conditional (kernel_settings_reboot_required | d(false)): False 10424 1726773072.25611: when evaluation is False, skipping this task 10424 1726773072.25614: _execute() done 10424 1726773072.25618: dumping result to json 10424 1726773072.25622: done dumping result, returning 10424 1726773072.25628: done running TaskExecutor() for managed_node3/TASK: Reboot the machine - see if settings persist after reboot [0affffe7-6841-7dd6-8fa6-00000000000c] 10424 1726773072.25633: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000000c 10424 1726773072.25655: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000000c 10424 1726773072.25658: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "kernel_settings_reboot_required | d(false)", "skip_reason": "Conditional result was False" } 9733 1726773072.25760: no more pending results, returning what we have 9733 1726773072.25762: results queue empty 9733 1726773072.25763: checking for any_errors_fatal 9733 1726773072.25768: done checking for any_errors_fatal 9733 1726773072.25768: checking for max_fail_percentage 9733 1726773072.25769: done checking for max_fail_percentage 9733 1726773072.25772: checking to see if all hosts have failed and the running result is not ok 9733 1726773072.25773: done checking to see if all hosts have failed 9733 1726773072.25773: getting the remaining hosts for this loop 9733 1726773072.25774: done getting the remaining hosts for this loop 9733 1726773072.25777: getting the next task for host managed_node3 9733 1726773072.25781: done getting next task for host managed_node3 9733 1726773072.25783: ^ task is: TASK: Verify that settings were applied correctly after reboot 9733 1726773072.25786: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773072.25790: getting variables 9733 1726773072.25791: in VariableManager get_vars() 9733 1726773072.25819: Calling all_inventory to load vars for managed_node3 9733 1726773072.25822: Calling groups_inventory to load vars for managed_node3 9733 1726773072.25823: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773072.25834: Calling all_plugins_play to load vars for managed_node3 9733 1726773072.25837: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773072.25839: Calling groups_plugins_play to load vars for managed_node3 9733 1726773072.26003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773072.26112: done with get_vars() 9733 1726773072.26120: done getting variables TASK [Verify that settings were applied correctly after reboot] **************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_simple_settings.yml:54 Thursday 19 September 2024 15:11:12 -0400 (0:00:00.032) 0:00:17.994 **** 9733 1726773072.26179: entering _queue_task() for managed_node3/include_tasks 9733 1726773072.26342: worker is 1 (out of 1 available) 9733 1726773072.26356: exiting _queue_task() for managed_node3/include_tasks 9733 1726773072.26369: done queuing things up, now waiting for results queue to drain 9733 1726773072.26372: waiting for pending results... 10425 1726773072.26483: running TaskExecutor() for managed_node3/TASK: Verify that settings were applied correctly after reboot 10425 1726773072.26578: in run() - task 0affffe7-6841-7dd6-8fa6-00000000000d 10425 1726773072.26594: variable 'ansible_search_path' from source: unknown 10425 1726773072.26623: calling self._execute() 10425 1726773072.26695: variable 'ansible_host' from source: host vars for 'managed_node3' 10425 1726773072.26705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10425 1726773072.26713: variable 'omit' from source: magic vars 10425 1726773072.27044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10425 1726773072.28535: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10425 1726773072.28586: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10425 1726773072.28614: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10425 1726773072.28640: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10425 1726773072.28661: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10425 1726773072.28717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10425 1726773072.28738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10425 1726773072.28757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10425 1726773072.28789: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10425 1726773072.28803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10425 1726773072.28878: variable 'kernel_settings_reboot_required' from source: set_fact 10425 1726773072.28896: Evaluated conditional (kernel_settings_reboot_required | d(false)): False 10425 1726773072.28901: when evaluation is False, skipping this task 10425 1726773072.28904: _execute() done 10425 1726773072.28908: dumping result to json 10425 1726773072.28911: done dumping result, returning 10425 1726773072.28918: done running TaskExecutor() for managed_node3/TASK: Verify that settings were applied correctly after reboot [0affffe7-6841-7dd6-8fa6-00000000000d] 10425 1726773072.28924: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000000d 10425 1726773072.28946: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000000d 10425 1726773072.28949: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "kernel_settings_reboot_required | d(false)", "skip_reason": "Conditional result was False" } 9733 1726773072.29058: no more pending results, returning what we have 9733 1726773072.29060: results queue empty 9733 1726773072.29061: checking for any_errors_fatal 9733 1726773072.29066: done checking for any_errors_fatal 9733 1726773072.29066: checking for max_fail_percentage 9733 1726773072.29068: done checking for max_fail_percentage 9733 1726773072.29068: checking to see if all hosts have failed and the running result is not ok 9733 1726773072.29069: done checking to see if all hosts have failed 9733 1726773072.29069: getting the remaining hosts for this loop 9733 1726773072.29073: done getting the remaining hosts for this loop 9733 1726773072.29076: getting the next task for host managed_node3 9733 1726773072.29081: done getting next task for host managed_node3 9733 1726773072.29083: ^ task is: TASK: Apply the settings again to check idempotency 9733 1726773072.29084: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773072.29088: getting variables 9733 1726773072.29089: in VariableManager get_vars() 9733 1726773072.29119: Calling all_inventory to load vars for managed_node3 9733 1726773072.29121: Calling groups_inventory to load vars for managed_node3 9733 1726773072.29123: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773072.29133: Calling all_plugins_play to load vars for managed_node3 9733 1726773072.29135: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773072.29138: Calling groups_plugins_play to load vars for managed_node3 9733 1726773072.29263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773072.29379: done with get_vars() 9733 1726773072.29389: done getting variables TASK [Apply the settings again to check idempotency] *************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_simple_settings.yml:63 Thursday 19 September 2024 15:11:12 -0400 (0:00:00.032) 0:00:18.027 **** 9733 1726773072.29448: entering _queue_task() for managed_node3/include_role 9733 1726773072.29623: worker is 1 (out of 1 available) 9733 1726773072.29638: exiting _queue_task() for managed_node3/include_role 9733 1726773072.29650: done queuing things up, now waiting for results queue to drain 9733 1726773072.29653: waiting for pending results... 10426 1726773072.29769: running TaskExecutor() for managed_node3/TASK: Apply the settings again to check idempotency 10426 1726773072.29867: in run() - task 0affffe7-6841-7dd6-8fa6-00000000000e 10426 1726773072.29883: variable 'ansible_search_path' from source: unknown 10426 1726773072.29914: calling self._execute() 10426 1726773072.29981: variable 'ansible_host' from source: host vars for 'managed_node3' 10426 1726773072.29992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10426 1726773072.30001: variable 'omit' from source: magic vars 10426 1726773072.30074: _execute() done 10426 1726773072.30080: dumping result to json 10426 1726773072.30084: done dumping result, returning 10426 1726773072.30091: done running TaskExecutor() for managed_node3/TASK: Apply the settings again to check idempotency [0affffe7-6841-7dd6-8fa6-00000000000e] 10426 1726773072.30098: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000000e 10426 1726773072.30119: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000000e 10426 1726773072.30122: WORKER PROCESS EXITING 9733 1726773072.30289: no more pending results, returning what we have 9733 1726773072.30292: in VariableManager get_vars() 9733 1726773072.30316: Calling all_inventory to load vars for managed_node3 9733 1726773072.30318: Calling groups_inventory to load vars for managed_node3 9733 1726773072.30319: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773072.30326: Calling all_plugins_play to load vars for managed_node3 9733 1726773072.30328: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773072.30330: Calling groups_plugins_play to load vars for managed_node3 9733 1726773072.30473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773072.30579: done with get_vars() 9733 1726773072.30587: variable 'ansible_search_path' from source: unknown 9733 1726773072.31494: variable 'omit' from source: magic vars 9733 1726773072.31511: variable 'omit' from source: magic vars 9733 
1726773072.31520: variable 'omit' from source: magic vars 9733 1726773072.31523: we have included files to process 9733 1726773072.31523: generating all_blocks data 9733 1726773072.31524: done generating all_blocks data 9733 1726773072.31527: processing included file: fedora.linux_system_roles.kernel_settings 9733 1726773072.31541: in VariableManager get_vars() 9733 1726773072.31576: done with get_vars() 9733 1726773072.31599: in VariableManager get_vars() 9733 1726773072.31610: done with get_vars() 9733 1726773072.31636: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml 9733 1726773072.31682: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml 9733 1726773072.31699: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml 9733 1726773072.31743: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml 9733 1726773072.32055: in VariableManager get_vars() 9733 1726773072.32069: done with get_vars() 9733 1726773072.32891: in VariableManager get_vars() 9733 1726773072.32906: done with get_vars() 9733 1726773072.33008: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml 9733 1726773072.33395: iterating over new_blocks loaded from include file 9733 1726773072.33397: in VariableManager get_vars() 9733 1726773072.33408: done with get_vars() 9733 1726773072.33410: filtering new block on tags 9733 1726773072.33446: done filtering new block on tags 9733 1726773072.33448: in VariableManager get_vars() 9733 1726773072.33456: done with get_vars() 9733 1726773072.33457: filtering new block on tags 9733 1726773072.33481: done filtering new block on tags 9733 1726773072.33483: in VariableManager get_vars() 9733 1726773072.33493: done with get_vars() 9733 1726773072.33494: filtering new block on tags 9733 1726773072.33574: done filtering new block on tags 9733 1726773072.33576: in VariableManager get_vars() 9733 1726773072.33587: done with get_vars() 9733 1726773072.33588: filtering new block on tags 9733 1726773072.33598: done filtering new block on tags 9733 1726773072.33599: done iterating over new_blocks loaded from include file 9733 1726773072.33600: extending task lists for all hosts with included blocks 9733 1726773072.34492: done extending task lists 9733 1726773072.34493: done processing included files 9733 1726773072.34493: results queue empty 9733 1726773072.34494: checking for any_errors_fatal 9733 1726773072.34496: done checking for any_errors_fatal 9733 1726773072.34496: checking for max_fail_percentage 9733 1726773072.34497: done checking for max_fail_percentage 9733 1726773072.34497: checking to see if all hosts have failed and the running result is not ok 9733 1726773072.34498: done checking to see if all hosts have failed 9733 1726773072.34498: getting the remaining hosts for this loop 9733 1726773072.34499: done getting the remaining hosts for this loop 9733 1726773072.34500: getting the next task for host managed_node3 9733 1726773072.34502: done getting next task for host managed_node3 9733 1726773072.34504: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 9733 1726773072.34505: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773072.34512: getting variables 9733 1726773072.34512: in VariableManager get_vars() 9733 1726773072.34521: Calling all_inventory to load vars for managed_node3 9733 1726773072.34522: Calling groups_inventory to load vars for managed_node3 9733 1726773072.34523: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773072.34527: Calling all_plugins_play to load vars for managed_node3 9733 1726773072.34528: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773072.34530: Calling groups_plugins_play to load vars for managed_node3 9733 1726773072.34626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773072.34734: done with get_vars() 9733 1726773072.34741: done getting variables 9733 1726773072.34765: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2 Thursday 19 September 2024 15:11:12 -0400 (0:00:00.053) 0:00:18.080 **** 9733 1726773072.34788: entering _queue_task() for managed_node3/fail 9733 1726773072.34994: worker is 1 (out of 1 available) 9733 1726773072.35011: exiting _queue_task() for managed_node3/fail 9733 1726773072.35023: done queuing things up, now waiting for results queue to drain 9733 1726773072.35024: waiting for pending results... 
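
The two post-reboot checks above are skipped because the role never set kernel_settings_reboot_required, and the role is then included a second time to verify idempotency. A minimal sketch of how these guarded test tasks could be written, assuming typical layout: the task names and the when condition are taken from the log, while the task bodies and the verify file name are illustrative assumptions.

- name: Reboot the machine - see if settings persist after reboot
  # ansible.builtin.reboot, matching the action plugin loaded in the log
  reboot:
  when: kernel_settings_reboot_required | d(false)

- name: Verify that settings were applied correctly after reboot
  include_tasks: verify_settings.yml   # file name is an assumption
  when: kernel_settings_reboot_required | d(false)

- name: Apply the settings again to check idempotency
  include_role:
    name: fedora.linux_system_roles.kernel_settings

Because d(false) supplies a default when the flag is unset, both guarded tasks skip cleanly, which is exactly what the skip results above report.
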
10427 1726773072.35144: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 10427 1726773072.35256: in run() - task 0affffe7-6841-7dd6-8fa6-0000000003fd 10427 1726773072.35271: variable 'ansible_search_path' from source: unknown 10427 1726773072.35276: variable 'ansible_search_path' from source: unknown 10427 1726773072.35306: calling self._execute() 10427 1726773072.35370: variable 'ansible_host' from source: host vars for 'managed_node3' 10427 1726773072.35379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10427 1726773072.35390: variable 'omit' from source: magic vars 10427 1726773072.35717: variable 'kernel_settings_sysctl' from source: include_vars 10427 1726773072.35734: variable '__kernel_settings_state_empty' from source: role '' all vars 10427 1726773072.35742: Evaluated conditional (kernel_settings_sysctl != __kernel_settings_state_empty): True 10427 1726773072.35940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10427 1726773072.37604: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10427 1726773072.37652: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10427 1726773072.37680: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10427 1726773072.37708: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10427 1726773072.37728: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10427 1726773072.37782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10427 1726773072.37806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10427 1726773072.37825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10427 1726773072.37853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10427 1726773072.37865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10427 1726773072.37906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10427 1726773072.37923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10427 1726773072.37940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 10427 1726773072.37968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10427 1726773072.37979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10427 1726773072.38007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10427 1726773072.38020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10427 1726773072.38033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10427 1726773072.38055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10427 1726773072.38063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10427 1726773072.38256: variable 'kernel_settings_sysctl' from source: include_vars 10427 1726773072.38311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10427 1726773072.38429: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10427 1726773072.38456: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10427 1726773072.38480: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10427 1726773072.38504: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10427 1726773072.38534: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10427 1726773072.38551: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10427 1726773072.38568: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10427 1726773072.38588: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10427 1726773072.38619: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10427 1726773072.38635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10427 1726773072.38653: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10427 1726773072.38670: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10427 1726773072.38693: Evaluated conditional ((kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", true) | list | length > 0) or (kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", false) | list | length > 0)): False 10427 1726773072.38698: when evaluation is False, skipping this task 10427 1726773072.38701: _execute() done 10427 1726773072.38705: dumping result to json 10427 1726773072.38708: done dumping result, returning 10427 1726773072.38714: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [0affffe7-6841-7dd6-8fa6-0000000003fd] 10427 1726773072.38720: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000003fd 10427 1726773072.38740: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000003fd 10427 1726773072.38743: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "(kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", true) | list | length > 0) or (kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", false) | list | length > 0)", "skip_reason": "Conditional result was False" } 9733 1726773072.38992: no more pending results, returning what we have 9733 1726773072.38994: results queue empty 9733 1726773072.38995: checking for any_errors_fatal 9733 1726773072.38996: done checking for any_errors_fatal 9733 1726773072.38996: checking for max_fail_percentage 9733 1726773072.38997: done checking for max_fail_percentage 9733 1726773072.38998: checking to see if all hosts have failed and the running result is not ok 9733 1726773072.38998: done checking to see if all hosts have failed 9733 1726773072.38998: getting the remaining hosts for this loop 9733 1726773072.38999: done getting the remaining hosts for this loop 9733 1726773072.39001: getting the next task for host managed_node3 9733 1726773072.39005: done getting next task for host managed_node3 9733 1726773072.39008: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 9733 1726773072.39009: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773072.39020: getting variables 9733 1726773072.39021: in VariableManager get_vars() 9733 1726773072.39044: Calling all_inventory to load vars for managed_node3 9733 1726773072.39046: Calling groups_inventory to load vars for managed_node3 9733 1726773072.39048: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773072.39055: Calling all_plugins_play to load vars for managed_node3 9733 1726773072.39057: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773072.39058: Calling groups_plugins_play to load vars for managed_node3 9733 1726773072.39165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773072.39292: done with get_vars() 9733 1726773072.39300: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Set version specific variables] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:9 Thursday 19 September 2024 15:11:12 -0400 (0:00:00.045) 0:00:18.126 **** 9733 1726773072.39359: entering _queue_task() for managed_node3/include_tasks 9733 1726773072.39526: worker is 1 (out of 1 available) 9733 1726773072.39541: exiting _queue_task() for managed_node3/include_tasks 9733 1726773072.39554: done queuing things up, now waiting for results queue to drain 9733 1726773072.39556: waiting for pending results... 10428 1726773072.39675: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 10428 1726773072.39779: in run() - task 0affffe7-6841-7dd6-8fa6-0000000003fe 10428 1726773072.39796: variable 'ansible_search_path' from source: unknown 10428 1726773072.39801: variable 'ansible_search_path' from source: unknown 10428 1726773072.39830: calling self._execute() 10428 1726773072.39895: variable 'ansible_host' from source: host vars for 'managed_node3' 10428 1726773072.39904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10428 1726773072.39913: variable 'omit' from source: magic vars 10428 1726773072.40202: _execute() done 10428 1726773072.40209: dumping result to json 10428 1726773072.40214: done dumping result, returning 10428 1726773072.40220: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables [0affffe7-6841-7dd6-8fa6-0000000003fe] 10428 1726773072.40228: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000003fe 10428 1726773072.40249: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000003fe 10428 1726773072.40253: WORKER PROCESS EXITING 9733 1726773072.40359: no more pending results, returning what we have 9733 1726773072.40363: in VariableManager get_vars() 9733 1726773072.40400: Calling all_inventory to load vars for managed_node3 9733 1726773072.40403: Calling groups_inventory to load vars for managed_node3 9733 1726773072.40405: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773072.40413: Calling all_plugins_play to load vars for managed_node3 9733 1726773072.40415: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773072.40417: Calling groups_plugins_play to load vars for managed_node3 9733 1726773072.40684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773072.40794: done with get_vars() 9733 1726773072.40799: variable 'ansible_search_path' from source: unknown 9733 
1726773072.40799: variable 'ansible_search_path' from source: unknown 9733 1726773072.40820: we have included files to process 9733 1726773072.40821: generating all_blocks data 9733 1726773072.40822: done generating all_blocks data 9733 1726773072.40824: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 9733 1726773072.40825: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 9733 1726773072.40826: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml for managed_node3 9733 1726773072.41258: done processing included file 9733 1726773072.41259: iterating over new_blocks loaded from include file 9733 1726773072.41260: in VariableManager get_vars() 9733 1726773072.41276: done with get_vars() 9733 1726773072.41277: filtering new block on tags 9733 1726773072.41296: done filtering new block on tags 9733 1726773072.41298: in VariableManager get_vars() 9733 1726773072.41310: done with get_vars() 9733 1726773072.41311: filtering new block on tags 9733 1726773072.41332: done filtering new block on tags 9733 1726773072.41333: in VariableManager get_vars() 9733 1726773072.41345: done with get_vars() 9733 1726773072.41346: filtering new block on tags 9733 1726773072.41389: done filtering new block on tags 9733 1726773072.41391: in VariableManager get_vars() 9733 1726773072.41406: done with get_vars() 9733 1726773072.41407: filtering new block on tags 9733 1726773072.41421: done filtering new block on tags 9733 1726773072.41422: done iterating over new_blocks loaded from include file 9733 1726773072.41423: extending task lists for all hosts with included blocks 9733 1726773072.41510: done extending task lists 9733 1726773072.41511: done processing included files 9733 1726773072.41512: results queue empty 9733 1726773072.41512: checking for any_errors_fatal 9733 1726773072.41514: done checking for any_errors_fatal 9733 1726773072.41514: checking for max_fail_percentage 9733 1726773072.41515: done checking for max_fail_percentage 9733 1726773072.41516: checking to see if all hosts have failed and the running result is not ok 9733 1726773072.41516: done checking to see if all hosts have failed 9733 1726773072.41516: getting the remaining hosts for this loop 9733 1726773072.41517: done getting the remaining hosts for this loop 9733 1726773072.41518: getting the next task for host managed_node3 9733 1726773072.41521: done getting next task for host managed_node3 9733 1726773072.41522: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 9733 1726773072.41524: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773072.41530: getting variables 9733 1726773072.41531: in VariableManager get_vars() 9733 1726773072.41539: Calling all_inventory to load vars for managed_node3 9733 1726773072.41541: Calling groups_inventory to load vars for managed_node3 9733 1726773072.41543: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773072.41546: Calling all_plugins_play to load vars for managed_node3 9733 1726773072.41547: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773072.41549: Calling groups_plugins_play to load vars for managed_node3 9733 1726773072.41626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773072.41735: done with get_vars() 9733 1726773072.41741: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2 Thursday 19 September 2024 15:11:12 -0400 (0:00:00.024) 0:00:18.150 **** 9733 1726773072.41789: entering _queue_task() for managed_node3/setup 9733 1726773072.41963: worker is 1 (out of 1 available) 9733 1726773072.41980: exiting _queue_task() for managed_node3/setup 9733 1726773072.41993: done queuing things up, now waiting for results queue to drain 9733 1726773072.41995: waiting for pending results... 10429 1726773072.42110: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 10429 1726773072.42231: in run() - task 0affffe7-6841-7dd6-8fa6-00000000050b 10429 1726773072.42247: variable 'ansible_search_path' from source: unknown 10429 1726773072.42251: variable 'ansible_search_path' from source: unknown 10429 1726773072.42279: calling self._execute() 10429 1726773072.42344: variable 'ansible_host' from source: host vars for 'managed_node3' 10429 1726773072.42354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10429 1726773072.42362: variable 'omit' from source: magic vars 10429 1726773072.42758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10429 1726773072.44268: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10429 1726773072.44317: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10429 1726773072.44345: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10429 1726773072.44383: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10429 1726773072.44405: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10429 1726773072.44458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10429 1726773072.44482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 10429 1726773072.44504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10429 1726773072.44531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10429 1726773072.44542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10429 1726773072.44579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10429 1726773072.44603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10429 1726773072.44621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10429 1726773072.44646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10429 1726773072.44656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10429 1726773072.44776: variable '__kernel_settings_required_facts' from source: role '' all vars 10429 1726773072.44788: variable 'ansible_facts' from source: unknown 10429 1726773072.44847: Evaluated conditional (__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 10429 1726773072.44852: when evaluation is False, skipping this task 10429 1726773072.44856: _execute() done 10429 1726773072.44859: dumping result to json 10429 1726773072.44863: done dumping result, returning 10429 1726773072.44870: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role [0affffe7-6841-7dd6-8fa6-00000000050b] 10429 1726773072.44876: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000050b 10429 1726773072.44900: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000050b 10429 1726773072.44903: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } 9733 1726773072.45010: no more pending results, returning what we have 9733 1726773072.45013: results queue empty 9733 1726773072.45013: checking for any_errors_fatal 9733 1726773072.45015: done checking for any_errors_fatal 9733 1726773072.45015: checking for max_fail_percentage 9733 1726773072.45017: done checking for max_fail_percentage 9733 1726773072.45017: checking to see if all hosts have failed and the running result is not ok 9733 1726773072.45018: done checking to see if all hosts have failed 9733 1726773072.45019: getting the remaining hosts for this loop 9733 
1726773072.45019: done getting the remaining hosts for this loop 9733 1726773072.45022: getting the next task for host managed_node3 9733 1726773072.45029: done getting next task for host managed_node3 9733 1726773072.45033: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 9733 1726773072.45036: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773072.45050: getting variables 9733 1726773072.45051: in VariableManager get_vars() 9733 1726773072.45089: Calling all_inventory to load vars for managed_node3 9733 1726773072.45092: Calling groups_inventory to load vars for managed_node3 9733 1726773072.45094: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773072.45103: Calling all_plugins_play to load vars for managed_node3 9733 1726773072.45105: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773072.45107: Calling groups_plugins_play to load vars for managed_node3 9733 1726773072.45261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773072.45388: done with get_vars() 9733 1726773072.45397: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if system is ostree] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10 Thursday 19 September 2024 15:11:12 -0400 (0:00:00.036) 0:00:18.187 **** 9733 1726773072.45458: entering _queue_task() for managed_node3/stat 9733 1726773072.45628: worker is 1 (out of 1 available) 9733 1726773072.45641: exiting _queue_task() for managed_node3/stat 9733 1726773072.45653: done queuing things up, now waiting for results queue to drain 9733 1726773072.45656: waiting for pending results... 
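
The "Check sysctl settings for boolean values" guard above is meant to fail fast if any entry in kernel_settings_sysctl carries a raw YAML boolean, since sysctl values should be written as 0/1 or strings. A sketch of such a fail task, reusing the exact conditional printed in the log; the error message wording is an assumption.

- name: Check sysctl settings for boolean values
  fail:
    msg: kernel_settings_sysctl must not contain boolean values   # wording assumed
  when:
    - kernel_settings_sysctl != __kernel_settings_state_empty
    - >-
      (kernel_settings_sysctl | selectattr("value", "defined") |
      selectattr("value", "sameas", true) | list | length > 0) or
      (kernel_settings_sysctl | selectattr("value", "defined") |
      selectattr("value", "sameas", false) | list | length > 0)

The sameas test matches only genuine booleans, so integer 0/1 or quoted values pass; in this run the second condition evaluates to False and the task is skipped, as shown above.
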
10430 1726773072.45776: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 10430 1726773072.45896: in run() - task 0affffe7-6841-7dd6-8fa6-00000000050d 10430 1726773072.45914: variable 'ansible_search_path' from source: unknown 10430 1726773072.45918: variable 'ansible_search_path' from source: unknown 10430 1726773072.45946: calling self._execute() 10430 1726773072.46012: variable 'ansible_host' from source: host vars for 'managed_node3' 10430 1726773072.46022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10430 1726773072.46030: variable 'omit' from source: magic vars 10430 1726773072.46349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10430 1726773072.46522: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10430 1726773072.46557: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10430 1726773072.46586: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10430 1726773072.46613: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10430 1726773072.46675: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10430 1726773072.46696: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10430 1726773072.46715: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10430 1726773072.46733: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10430 1726773072.46820: variable '__kernel_settings_is_ostree' from source: set_fact 10430 1726773072.46831: Evaluated conditional (not __kernel_settings_is_ostree is defined): False 10430 1726773072.46835: when evaluation is False, skipping this task 10430 1726773072.46839: _execute() done 10430 1726773072.46842: dumping result to json 10430 1726773072.46846: done dumping result, returning 10430 1726773072.46852: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree [0affffe7-6841-7dd6-8fa6-00000000050d] 10430 1726773072.46858: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000050d 10430 1726773072.46878: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000050d 10430 1726773072.46881: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __kernel_settings_is_ostree is defined", "skip_reason": "Conditional result was False" } 9733 1726773072.47007: no more pending results, returning what we have 9733 1726773072.47009: results queue empty 9733 1726773072.47010: checking for any_errors_fatal 9733 1726773072.47015: done checking for any_errors_fatal 9733 1726773072.47016: checking for max_fail_percentage 9733 1726773072.47017: done checking for max_fail_percentage 9733 1726773072.47018: checking to see if all hosts have failed and the 
running result is not ok 9733 1726773072.47018: done checking to see if all hosts have failed 9733 1726773072.47019: getting the remaining hosts for this loop 9733 1726773072.47020: done getting the remaining hosts for this loop 9733 1726773072.47022: getting the next task for host managed_node3 9733 1726773072.47028: done getting next task for host managed_node3 9733 1726773072.47031: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 9733 1726773072.47034: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773072.47048: getting variables 9733 1726773072.47049: in VariableManager get_vars() 9733 1726773072.47078: Calling all_inventory to load vars for managed_node3 9733 1726773072.47080: Calling groups_inventory to load vars for managed_node3 9733 1726773072.47081: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773072.47090: Calling all_plugins_play to load vars for managed_node3 9733 1726773072.47091: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773072.47093: Calling groups_plugins_play to load vars for managed_node3 9733 1726773072.47196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773072.47313: done with get_vars() 9733 1726773072.47320: done getting variables 9733 1726773072.47356: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:15 Thursday 19 September 2024 15:11:12 -0400 (0:00:00.019) 0:00:18.206 **** 9733 1726773072.47378: entering _queue_task() for managed_node3/set_fact 9733 1726773072.47529: worker is 1 (out of 1 available) 9733 1726773072.47543: exiting _queue_task() for managed_node3/set_fact 9733 1726773072.47554: done queuing things up, now waiting for results queue to drain 9733 1726773072.47556: waiting for pending results... 
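
Both ostree-related tasks in this stretch skip because __kernel_settings_is_ostree was already set by an earlier pass of the role (the log shows it coming from set_fact). A sketch of the detect-once pattern they implement, assuming the usual marker file: the task names and when conditions come from the log, while the stat path and register name are assumptions.

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted   # conventional marker path; not shown in the log
  register: __kernel_settings_ostree_stat   # register name assumed
  when: not __kernel_settings_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __kernel_settings_is_ostree: "{{ __kernel_settings_ostree_stat.stat.exists }}"
  when: not __kernel_settings_is_ostree is defined

Since set_fact results persist for the host for the rest of the play, this second (idempotency) application of the role skips the detection entirely.
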
10431 1726773072.47675: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 10431 1726773072.47797: in run() - task 0affffe7-6841-7dd6-8fa6-00000000050e 10431 1726773072.47814: variable 'ansible_search_path' from source: unknown 10431 1726773072.47818: variable 'ansible_search_path' from source: unknown 10431 1726773072.47845: calling self._execute() 10431 1726773072.47911: variable 'ansible_host' from source: host vars for 'managed_node3' 10431 1726773072.47920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10431 1726773072.47928: variable 'omit' from source: magic vars 10431 1726773072.48239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10431 1726773072.48463: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10431 1726773072.48499: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10431 1726773072.48523: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10431 1726773072.48550: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10431 1726773072.48610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10431 1726773072.48629: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10431 1726773072.48648: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10431 1726773072.48666: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10431 1726773072.48754: variable '__kernel_settings_is_ostree' from source: set_fact 10431 1726773072.48765: Evaluated conditional (not __kernel_settings_is_ostree is defined): False 10431 1726773072.48768: when evaluation is False, skipping this task 10431 1726773072.48772: _execute() done 10431 1726773072.48775: dumping result to json 10431 1726773072.48777: done dumping result, returning 10431 1726773072.48780: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree [0affffe7-6841-7dd6-8fa6-00000000050e] 10431 1726773072.48786: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000050e 10431 1726773072.48811: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000050e 10431 1726773072.48815: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __kernel_settings_is_ostree is defined", "skip_reason": "Conditional result was False" } 9733 1726773072.48912: no more pending results, returning what we have 9733 1726773072.48914: results queue empty 9733 1726773072.48915: checking for any_errors_fatal 9733 1726773072.48920: done checking for any_errors_fatal 9733 1726773072.48921: checking for max_fail_percentage 9733 1726773072.48922: done checking for max_fail_percentage 9733 1726773072.48922: checking to see if all 
hosts have failed and the running result is not ok 9733 1726773072.48923: done checking to see if all hosts have failed 9733 1726773072.48923: getting the remaining hosts for this loop 9733 1726773072.48924: done getting the remaining hosts for this loop 9733 1726773072.48927: getting the next task for host managed_node3 9733 1726773072.48935: done getting next task for host managed_node3 9733 1726773072.48938: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 9733 1726773072.48941: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773072.48955: getting variables 9733 1726773072.48957: in VariableManager get_vars() 9733 1726773072.48986: Calling all_inventory to load vars for managed_node3 9733 1726773072.48989: Calling groups_inventory to load vars for managed_node3 9733 1726773072.48991: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773072.49000: Calling all_plugins_play to load vars for managed_node3 9733 1726773072.49002: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773072.49005: Calling groups_plugins_play to load vars for managed_node3 9733 1726773072.49144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773072.49258: done with get_vars() 9733 1726773072.49265: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:22 Thursday 19 September 2024 15:11:12 -0400 (0:00:00.019) 0:00:18.225 **** 9733 1726773072.49326: entering _queue_task() for managed_node3/stat 9733 1726773072.49476: worker is 1 (out of 1 available) 9733 1726773072.49490: exiting _queue_task() for managed_node3/stat 9733 1726773072.49502: done queuing things up, now waiting for results queue to drain 9733 1726773072.49504: waiting for pending results... 
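
The transactional-update probe queued above follows the same shape: a stat of /sbin/transactional-update (the path implied by the task name) that only runs while the flag is still undefined. A minimal sketch; the register name is an assumption.

- name: Check if transactional-update exists in /sbin
  stat:
    path: /sbin/transactional-update
  register: __kernel_settings_transactional_update_stat   # register name assumed
  when: not __kernel_settings_is_transactional is defined
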
10432 1726773072.49621: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 10432 1726773072.49741: in run() - task 0affffe7-6841-7dd6-8fa6-000000000510 10432 1726773072.49757: variable 'ansible_search_path' from source: unknown 10432 1726773072.49762: variable 'ansible_search_path' from source: unknown 10432 1726773072.49792: calling self._execute() 10432 1726773072.49854: variable 'ansible_host' from source: host vars for 'managed_node3' 10432 1726773072.49863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10432 1726773072.49874: variable 'omit' from source: magic vars 10432 1726773072.50187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10432 1726773072.50358: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10432 1726773072.50395: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10432 1726773072.50421: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10432 1726773072.50446: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10432 1726773072.50508: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10432 1726773072.50529: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10432 1726773072.50548: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10432 1726773072.50568: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10432 1726773072.50652: variable '__kernel_settings_is_transactional' from source: set_fact 10432 1726773072.50663: Evaluated conditional (not __kernel_settings_is_transactional is defined): False 10432 1726773072.50668: when evaluation is False, skipping this task 10432 1726773072.50673: _execute() done 10432 1726773072.50677: dumping result to json 10432 1726773072.50681: done dumping result, returning 10432 1726773072.50688: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin [0affffe7-6841-7dd6-8fa6-000000000510] 10432 1726773072.50694: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000510 10432 1726773072.50715: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000510 10432 1726773072.50717: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __kernel_settings_is_transactional is defined", "skip_reason": "Conditional result was False" } 9733 1726773072.50887: no more pending results, returning what we have 9733 1726773072.50890: results queue empty 9733 1726773072.50890: checking for any_errors_fatal 9733 1726773072.50894: done checking for any_errors_fatal 9733 1726773072.50894: checking for max_fail_percentage 9733 1726773072.50895: done checking for max_fail_percentage 9733 
1726773072.50896: checking to see if all hosts have failed and the running result is not ok 9733 1726773072.50896: done checking to see if all hosts have failed 9733 1726773072.50896: getting the remaining hosts for this loop 9733 1726773072.50897: done getting the remaining hosts for this loop 9733 1726773072.50900: getting the next task for host managed_node3 9733 1726773072.50903: done getting next task for host managed_node3 9733 1726773072.50906: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 9733 1726773072.50908: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773072.50918: getting variables 9733 1726773072.50919: in VariableManager get_vars() 9733 1726773072.50940: Calling all_inventory to load vars for managed_node3 9733 1726773072.50941: Calling groups_inventory to load vars for managed_node3 9733 1726773072.50942: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773072.50948: Calling all_plugins_play to load vars for managed_node3 9733 1726773072.50950: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773072.50951: Calling groups_plugins_play to load vars for managed_node3 9733 1726773072.51055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773072.51174: done with get_vars() 9733 1726773072.51181: done getting variables 9733 1726773072.51221: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:27 Thursday 19 September 2024 15:11:12 -0400 (0:00:00.019) 0:00:18.244 **** 9733 1726773072.51242: entering _queue_task() for managed_node3/set_fact 9733 1726773072.51398: worker is 1 (out of 1 available) 9733 1726773072.51410: exiting _queue_task() for managed_node3/set_fact 9733 1726773072.51421: done queuing things up, now waiting for results queue to drain 9733 1726773072.51423: waiting for pending results... 
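
The set_fact task queued here would normally turn that stat result into the __kernel_settings_is_transactional flag; it is likewise skipped because the fact is already defined. A hedged sketch, reusing the assumed register name from the previous example:

    - name: Set flag if transactional-update exists
      set_fact:
        __kernel_settings_is_transactional: "{{ __transactional_update_stat.stat.exists | d(false) }}"
      when: not __kernel_settings_is_transactional is defined
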
10433 1726773072.51546: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 10433 1726773072.51666: in run() - task 0affffe7-6841-7dd6-8fa6-000000000511 10433 1726773072.51687: variable 'ansible_search_path' from source: unknown 10433 1726773072.51693: variable 'ansible_search_path' from source: unknown 10433 1726773072.51720: calling self._execute() 10433 1726773072.51786: variable 'ansible_host' from source: host vars for 'managed_node3' 10433 1726773072.51795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10433 1726773072.51803: variable 'omit' from source: magic vars 10433 1726773072.52123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10433 1726773072.52355: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10433 1726773072.52391: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10433 1726773072.52418: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10433 1726773072.52444: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10433 1726773072.52505: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10433 1726773072.52527: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10433 1726773072.52546: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10433 1726773072.52564: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10433 1726773072.52647: variable '__kernel_settings_is_transactional' from source: set_fact 10433 1726773072.52658: Evaluated conditional (not __kernel_settings_is_transactional is defined): False 10433 1726773072.52662: when evaluation is False, skipping this task 10433 1726773072.52665: _execute() done 10433 1726773072.52669: dumping result to json 10433 1726773072.52675: done dumping result, returning 10433 1726773072.52682: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists [0affffe7-6841-7dd6-8fa6-000000000511] 10433 1726773072.52689: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000511 10433 1726773072.52710: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000511 10433 1726773072.52714: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __kernel_settings_is_transactional is defined", "skip_reason": "Conditional result was False" } 9733 1726773072.52814: no more pending results, returning what we have 9733 1726773072.52817: results queue empty 9733 1726773072.52818: checking for any_errors_fatal 9733 1726773072.52822: done checking for any_errors_fatal 9733 1726773072.52823: checking for max_fail_percentage 9733 1726773072.52824: done checking for max_fail_percentage 9733 1726773072.52825: 
checking to see if all hosts have failed and the running result is not ok 9733 1726773072.52825: done checking to see if all hosts have failed 9733 1726773072.52826: getting the remaining hosts for this loop 9733 1726773072.52827: done getting the remaining hosts for this loop 9733 1726773072.52829: getting the next task for host managed_node3 9733 1726773072.52837: done getting next task for host managed_node3 9733 1726773072.52840: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 9733 1726773072.52843: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773072.52856: getting variables 9733 1726773072.52857: in VariableManager get_vars() 9733 1726773072.52887: Calling all_inventory to load vars for managed_node3 9733 1726773072.52889: Calling groups_inventory to load vars for managed_node3 9733 1726773072.52891: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773072.52900: Calling all_plugins_play to load vars for managed_node3 9733 1726773072.52903: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773072.52905: Calling groups_plugins_play to load vars for managed_node3 9733 1726773072.53048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773072.53164: done with get_vars() 9733 1726773072.53171: done getting variables 9733 1726773072.53210: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set platform/version specific variables] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:31 Thursday 19 September 2024 15:11:12 -0400 (0:00:00.019) 0:00:18.264 **** 9733 1726773072.53231: entering _queue_task() for managed_node3/include_vars 9733 1726773072.53382: worker is 1 (out of 1 available) 9733 1726773072.53397: exiting _queue_task() for managed_node3/include_vars 9733 1726773072.53409: done queuing things up, now waiting for results queue to drain 9733 1726773072.53411: waiting for pending results... 
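
The include_vars task queued above drives the first_found lookup visible in the transcript that follows: ffparams is built from ansible_facts and role_path, four fact-based candidates are tried, and vars/default.yml is what finally loads, setting __kernel_settings_packages and __kernel_settings_services. A sketch of the common linux_system_roles pattern; the exact candidate file names are assumptions inferred from the four ansible_facts evaluations in the log:

    - name: Set platform/version specific variables
      include_vars: "{{ lookup('first_found', ffparams) }}"
      vars:
        ffparams:
          files:
            - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
            - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
            - "{{ ansible_facts['distribution'] }}.yml"
            - "{{ ansible_facts['os_family'] }}.yml"
            - default.yml
          paths:
            - "{{ role_path }}/vars"
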
10434 1726773072.53530: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 10434 1726773072.53645: in run() - task 0affffe7-6841-7dd6-8fa6-000000000513 10434 1726773072.53662: variable 'ansible_search_path' from source: unknown 10434 1726773072.53667: variable 'ansible_search_path' from source: unknown 10434 1726773072.53696: calling self._execute() 10434 1726773072.53760: variable 'ansible_host' from source: host vars for 'managed_node3' 10434 1726773072.53768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10434 1726773072.53778: variable 'omit' from source: magic vars 10434 1726773072.53850: variable 'omit' from source: magic vars 10434 1726773072.53898: variable 'omit' from source: magic vars 10434 1726773072.54157: variable 'ffparams' from source: task vars 10434 1726773072.54253: variable 'ansible_facts' from source: unknown 10434 1726773072.54392: variable 'ansible_facts' from source: unknown 10434 1726773072.54478: variable 'ansible_facts' from source: unknown 10434 1726773072.54564: variable 'ansible_facts' from source: unknown 10434 1726773072.54642: variable 'role_path' from source: magic vars 10434 1726773072.54763: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 10434 1726773072.54922: Loaded config def from plugin (lookup/first_found) 10434 1726773072.54930: Loading LookupModule 'first_found' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/first_found.py 10434 1726773072.54955: variable 'ansible_search_path' from source: unknown 10434 1726773072.54972: variable 'ansible_search_path' from source: unknown 10434 1726773072.54978: variable 'ansible_search_path' from source: unknown 10434 1726773072.54983: variable 'ansible_search_path' from source: unknown 10434 1726773072.54989: variable 'ansible_search_path' from source: unknown 10434 1726773072.55002: variable 'omit' from source: magic vars 10434 1726773072.55019: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10434 1726773072.55035: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10434 1726773072.55049: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10434 1726773072.55061: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10434 1726773072.55068: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10434 1726773072.55089: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10434 1726773072.55093: variable 'ansible_host' from source: host vars for 'managed_node3' 10434 1726773072.55095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10434 1726773072.55156: Set connection var ansible_timeout to 10 10434 1726773072.55160: Set connection var ansible_shell_type to sh 10434 1726773072.55164: Set connection var ansible_module_compression to ZIP_DEFLATED 10434 1726773072.55167: Set connection var ansible_shell_executable to /bin/sh 10434 1726773072.55170: Set connection var ansible_pipelining to False 10434 1726773072.55175: Set connection var ansible_connection to ssh 10434 1726773072.55190: variable 'ansible_shell_executable' from source: unknown 10434 1726773072.55195: variable 'ansible_connection' from source: unknown 
10434 1726773072.55198: variable 'ansible_module_compression' from source: unknown 10434 1726773072.55201: variable 'ansible_shell_type' from source: unknown 10434 1726773072.55204: variable 'ansible_shell_executable' from source: unknown 10434 1726773072.55207: variable 'ansible_host' from source: host vars for 'managed_node3' 10434 1726773072.55211: variable 'ansible_pipelining' from source: unknown 10434 1726773072.55215: variable 'ansible_timeout' from source: unknown 10434 1726773072.55218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10434 1726773072.55292: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10434 1726773072.55303: variable 'omit' from source: magic vars 10434 1726773072.55309: starting attempt loop 10434 1726773072.55313: running the handler 10434 1726773072.55355: handler run complete 10434 1726773072.55365: attempt loop complete, returning result 10434 1726773072.55369: _execute() done 10434 1726773072.55373: dumping result to json 10434 1726773072.55377: done dumping result, returning 10434 1726773072.55384: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables [0affffe7-6841-7dd6-8fa6-000000000513] 10434 1726773072.55391: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000513 10434 1726773072.55415: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000513 10434 1726773072.55418: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_packages": [ "tuned", "python3-configobj" ], "__kernel_settings_services": [ "tuned" ] }, "ansible_included_var_files": [ "/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/default.yml" ], "changed": false } 9733 1726773072.55567: no more pending results, returning what we have 9733 1726773072.55570: results queue empty 9733 1726773072.55573: checking for any_errors_fatal 9733 1726773072.55578: done checking for any_errors_fatal 9733 1726773072.55578: checking for max_fail_percentage 9733 1726773072.55579: done checking for max_fail_percentage 9733 1726773072.55580: checking to see if all hosts have failed and the running result is not ok 9733 1726773072.55580: done checking to see if all hosts have failed 9733 1726773072.55581: getting the remaining hosts for this loop 9733 1726773072.55582: done getting the remaining hosts for this loop 9733 1726773072.55586: getting the next task for host managed_node3 9733 1726773072.55592: done getting next task for host managed_node3 9733 1726773072.55595: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 9733 1726773072.55597: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 9733 1726773072.55608: getting variables 9733 1726773072.55609: in VariableManager get_vars() 9733 1726773072.55634: Calling all_inventory to load vars for managed_node3 9733 1726773072.55636: Calling groups_inventory to load vars for managed_node3 9733 1726773072.55637: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773072.55644: Calling all_plugins_play to load vars for managed_node3 9733 1726773072.55645: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773072.55647: Calling groups_plugins_play to load vars for managed_node3 9733 1726773072.55751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773072.55873: done with get_vars() 9733 1726773072.55880: done getting variables 9733 1726773072.55920: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required packages are installed] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 Thursday 19 September 2024 15:11:12 -0400 (0:00:00.027) 0:00:18.291 **** 9733 1726773072.55940: entering _queue_task() for managed_node3/package 9733 1726773072.56098: worker is 1 (out of 1 available) 9733 1726773072.56113: exiting _queue_task() for managed_node3/package 9733 1726773072.56123: done queuing things up, now waiting for results queue to drain 9733 1726773072.56125: waiting for pending results... 
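
The package task queued above installs the list that the included vars just provided; as the following transcript shows, the generic package action resolves to the dnf module on this host and reports "Nothing to do" because tuned and python3-configobj are already present. A minimal sketch, assuming no options beyond name and state:

    - name: Ensure required packages are installed
      package:
        name: "{{ __kernel_settings_packages }}"
        state: present
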
10435 1726773072.56241: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 10435 1726773072.56340: in run() - task 0affffe7-6841-7dd6-8fa6-0000000003ff 10435 1726773072.56355: variable 'ansible_search_path' from source: unknown 10435 1726773072.56358: variable 'ansible_search_path' from source: unknown 10435 1726773072.56383: calling self._execute() 10435 1726773072.56450: variable 'ansible_host' from source: host vars for 'managed_node3' 10435 1726773072.56458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10435 1726773072.56464: variable 'omit' from source: magic vars 10435 1726773072.56530: variable 'omit' from source: magic vars 10435 1726773072.56561: variable 'omit' from source: magic vars 10435 1726773072.56580: variable '__kernel_settings_packages' from source: include_vars 10435 1726773072.56843: variable '__kernel_settings_packages' from source: include_vars 10435 1726773072.57001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10435 1726773072.58462: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10435 1726773072.58519: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10435 1726773072.58548: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10435 1726773072.58576: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10435 1726773072.58598: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10435 1726773072.58665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10435 1726773072.58689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10435 1726773072.58709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10435 1726773072.58735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10435 1726773072.58747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10435 1726773072.58821: variable '__kernel_settings_is_ostree' from source: set_fact 10435 1726773072.58828: variable 'omit' from source: magic vars 10435 1726773072.58850: variable 'omit' from source: magic vars 10435 1726773072.58872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10435 1726773072.58894: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10435 1726773072.58908: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10435 1726773072.58920: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10435 1726773072.58927: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10435 1726773072.58947: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10435 1726773072.58950: variable 'ansible_host' from source: host vars for 'managed_node3' 10435 1726773072.58952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10435 1726773072.59030: Set connection var ansible_timeout to 10 10435 1726773072.59035: Set connection var ansible_shell_type to sh 10435 1726773072.59041: Set connection var ansible_module_compression to ZIP_DEFLATED 10435 1726773072.59047: Set connection var ansible_shell_executable to /bin/sh 10435 1726773072.59052: Set connection var ansible_pipelining to False 10435 1726773072.59058: Set connection var ansible_connection to ssh 10435 1726773072.59075: variable 'ansible_shell_executable' from source: unknown 10435 1726773072.59079: variable 'ansible_connection' from source: unknown 10435 1726773072.59082: variable 'ansible_module_compression' from source: unknown 10435 1726773072.59087: variable 'ansible_shell_type' from source: unknown 10435 1726773072.59090: variable 'ansible_shell_executable' from source: unknown 10435 1726773072.59094: variable 'ansible_host' from source: host vars for 'managed_node3' 10435 1726773072.59098: variable 'ansible_pipelining' from source: unknown 10435 1726773072.59101: variable 'ansible_timeout' from source: unknown 10435 1726773072.59106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10435 1726773072.59166: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10435 1726773072.59179: variable 'omit' from source: magic vars 10435 1726773072.59188: starting attempt loop 10435 1726773072.59191: running the handler 10435 1726773072.59251: variable 'ansible_facts' from source: unknown 10435 1726773072.59331: _low_level_execute_command(): starting 10435 1726773072.59340: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10435 1726773072.61697: stdout chunk (state=2): >>>/root <<< 10435 1726773072.61820: stderr chunk (state=3): >>><<< 10435 1726773072.61828: stdout chunk (state=3): >>><<< 10435 1726773072.61846: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10435 1726773072.61857: _low_level_execute_command(): starting 10435 1726773072.61863: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773072.6185334-10435-213324519525870 `" && echo ansible-tmp-1726773072.6185334-10435-213324519525870="` echo /root/.ansible/tmp/ansible-tmp-1726773072.6185334-10435-213324519525870 `" ) && sleep 0' 10435 1726773072.64351: stdout chunk (state=2): >>>ansible-tmp-1726773072.6185334-10435-213324519525870=/root/.ansible/tmp/ansible-tmp-1726773072.6185334-10435-213324519525870 <<< 10435 1726773072.64486: stderr chunk (state=3): >>><<< 10435 1726773072.64494: stdout chunk (state=3): >>><<< 10435 1726773072.64509: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726773072.6185334-10435-213324519525870=/root/.ansible/tmp/ansible-tmp-1726773072.6185334-10435-213324519525870 , stderr= 10435 1726773072.64535: variable 'ansible_module_compression' from source: unknown 10435 1726773072.64579: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 10435 1726773072.64618: variable 'ansible_facts' from source: unknown 10435 1726773072.64709: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773072.6185334-10435-213324519525870/AnsiballZ_dnf.py 10435 1726773072.64812: Sending initial data 10435 1726773072.64819: Sent initial data (151 bytes) 10435 1726773072.67335: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmppeprqusz /root/.ansible/tmp/ansible-tmp-1726773072.6185334-10435-213324519525870/AnsiballZ_dnf.py <<< 10435 1726773072.68778: stderr chunk (state=3): >>><<< 10435 1726773072.68790: stdout chunk (state=3): >>><<< 10435 1726773072.68810: done transferring module to remote 10435 1726773072.68822: _low_level_execute_command(): starting 10435 1726773072.68827: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773072.6185334-10435-213324519525870/ /root/.ansible/tmp/ansible-tmp-1726773072.6185334-10435-213324519525870/AnsiballZ_dnf.py && sleep 0' 10435 1726773072.71229: stderr chunk (state=2): >>><<< 10435 1726773072.71240: stdout chunk (state=2): >>><<< 10435 1726773072.71255: _low_level_execute_command() done: rc=0, stdout=, stderr= 10435 1726773072.71259: _low_level_execute_command(): starting 10435 1726773072.71265: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773072.6185334-10435-213324519525870/AnsiballZ_dnf.py && sleep 0' 10435 1726773075.25730: stdout chunk (state=2): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 10435 1726773075.33550: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. 
<<< 10435 1726773075.33603: stderr chunk (state=3): >>><<< 10435 1726773075.33611: stdout chunk (state=3): >>><<< 10435 1726773075.33628: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.47.99 closed. 10435 1726773075.33661: done with _execute_module (ansible.legacy.dnf, {'name': ['tuned', 'python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773072.6185334-10435-213324519525870/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10435 1726773075.33671: _low_level_execute_command(): starting 10435 1726773075.33677: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773072.6185334-10435-213324519525870/ > /dev/null 2>&1 && sleep 0' 10435 1726773075.36173: stderr chunk (state=2): >>><<< 10435 1726773075.36183: stdout chunk (state=2): >>><<< 10435 1726773075.36199: _low_level_execute_command() done: rc=0, stdout=, stderr= 10435 1726773075.36206: handler run complete 10435 1726773075.36231: attempt loop complete, returning result 10435 1726773075.36235: _execute() done 10435 1726773075.36239: dumping result to json 10435 1726773075.36245: done dumping result, returning 10435 1726773075.36252: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed [0affffe7-6841-7dd6-8fa6-0000000003ff] 10435 1726773075.36258: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000003ff 10435 1726773075.36290: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000003ff 10435 1726773075.36292: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 9733 1726773075.36558: no more pending results, returning what we have 9733 1726773075.36562: results queue empty 9733 1726773075.36562: checking for any_errors_fatal 9733 1726773075.36568: done checking for any_errors_fatal 9733 1726773075.36569: checking for max_fail_percentage 9733 1726773075.36570: done checking for max_fail_percentage 9733 1726773075.36573: checking to see if all hosts have failed and the running result is not ok 9733 1726773075.36573: done checking to see if all hosts have failed 9733 1726773075.36573: getting the remaining hosts for this loop 9733 
1726773075.36574: done getting the remaining hosts for this loop 9733 1726773075.36577: getting the next task for host managed_node3 9733 1726773075.36581: done getting next task for host managed_node3 9733 1726773075.36584: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 9733 1726773075.36588: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773075.36596: getting variables 9733 1726773075.36596: in VariableManager get_vars() 9733 1726773075.36619: Calling all_inventory to load vars for managed_node3 9733 1726773075.36621: Calling groups_inventory to load vars for managed_node3 9733 1726773075.36622: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773075.36628: Calling all_plugins_play to load vars for managed_node3 9733 1726773075.36630: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773075.36631: Calling groups_plugins_play to load vars for managed_node3 9733 1726773075.36782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773075.36900: done with get_vars() 9733 1726773075.36908: done getting variables 9733 1726773075.36946: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:24 Thursday 19 September 2024 15:11:15 -0400 (0:00:02.810) 0:00:21.102 **** 9733 1726773075.36966: entering _queue_task() for managed_node3/debug 9733 1726773075.37141: worker is 1 (out of 1 available) 9733 1726773075.37158: exiting _queue_task() for managed_node3/debug 9733 1726773075.37174: done queuing things up, now waiting for results queue to drain 9733 1726773075.37176: waiting for pending results... 
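
The next three tasks apply only to transactional-update systems and are all skipped here because __kernel_settings_is_transactional | d(false) is false. The first is a debug notification; a hedged sketch (the message text is invented, and the real task may carry additional conditions beyond the one reported in the skip):

    - name: Notify user that reboot is needed to apply changes
      debug:
        msg: A reboot is required to apply kernel_settings changes on this transactional system.   # wording assumed
      when: __kernel_settings_is_transactional | d(false)
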
10471 1726773075.37296: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 10471 1726773075.37414: in run() - task 0affffe7-6841-7dd6-8fa6-000000000401 10471 1726773075.37430: variable 'ansible_search_path' from source: unknown 10471 1726773075.37436: variable 'ansible_search_path' from source: unknown 10471 1726773075.37464: calling self._execute() 10471 1726773075.37533: variable 'ansible_host' from source: host vars for 'managed_node3' 10471 1726773075.37542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10471 1726773075.37550: variable 'omit' from source: magic vars 10471 1726773075.37883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10471 1726773075.39404: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10471 1726773075.39451: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10471 1726773075.39483: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10471 1726773075.39513: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10471 1726773075.39533: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10471 1726773075.39593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10471 1726773075.39615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10471 1726773075.39634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10471 1726773075.39661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10471 1726773075.39676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10471 1726773075.39753: variable '__kernel_settings_is_transactional' from source: set_fact 10471 1726773075.39769: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 10471 1726773075.39776: when evaluation is False, skipping this task 10471 1726773075.39780: _execute() done 10471 1726773075.39783: dumping result to json 10471 1726773075.39789: done dumping result, returning 10471 1726773075.39795: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes [0affffe7-6841-7dd6-8fa6-000000000401] 10471 1726773075.39801: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000401 10471 1726773075.39824: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000401 10471 1726773075.39827: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "__kernel_settings_is_transactional | 
d(false)" } 9733 1726773075.39939: no more pending results, returning what we have 9733 1726773075.39942: results queue empty 9733 1726773075.39943: checking for any_errors_fatal 9733 1726773075.39950: done checking for any_errors_fatal 9733 1726773075.39951: checking for max_fail_percentage 9733 1726773075.39952: done checking for max_fail_percentage 9733 1726773075.39953: checking to see if all hosts have failed and the running result is not ok 9733 1726773075.39953: done checking to see if all hosts have failed 9733 1726773075.39954: getting the remaining hosts for this loop 9733 1726773075.39955: done getting the remaining hosts for this loop 9733 1726773075.39957: getting the next task for host managed_node3 9733 1726773075.39963: done getting next task for host managed_node3 9733 1726773075.39967: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 9733 1726773075.39969: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773075.39983: getting variables 9733 1726773075.39986: in VariableManager get_vars() 9733 1726773075.40018: Calling all_inventory to load vars for managed_node3 9733 1726773075.40020: Calling groups_inventory to load vars for managed_node3 9733 1726773075.40022: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773075.40031: Calling all_plugins_play to load vars for managed_node3 9733 1726773075.40034: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773075.40036: Calling groups_plugins_play to load vars for managed_node3 9733 1726773075.40154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773075.40271: done with get_vars() 9733 1726773075.40280: done getting variables 9733 1726773075.40323: Loading ActionModule 'reboot' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Reboot transactional update systems] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:29 Thursday 19 September 2024 15:11:15 -0400 (0:00:00.033) 0:00:21.135 **** 9733 1726773075.40346: entering _queue_task() for managed_node3/reboot 9733 1726773075.40505: worker is 1 (out of 1 available) 9733 1726773075.40522: exiting _queue_task() for managed_node3/reboot 9733 1726773075.40534: done queuing things up, now waiting for results queue to drain 9733 1726773075.40537: waiting for pending results... 
10472 1726773075.40659: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 10472 1726773075.40768: in run() - task 0affffe7-6841-7dd6-8fa6-000000000402 10472 1726773075.40788: variable 'ansible_search_path' from source: unknown 10472 1726773075.40792: variable 'ansible_search_path' from source: unknown 10472 1726773075.40819: calling self._execute() 10472 1726773075.40887: variable 'ansible_host' from source: host vars for 'managed_node3' 10472 1726773075.40896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10472 1726773075.40904: variable 'omit' from source: magic vars 10472 1726773075.41313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10472 1726773075.43057: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10472 1726773075.43119: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10472 1726773075.43149: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10472 1726773075.43178: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10472 1726773075.43204: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10472 1726773075.43258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10472 1726773075.43284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10472 1726773075.43305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10472 1726773075.43334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10472 1726773075.43346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10472 1726773075.43426: variable '__kernel_settings_is_transactional' from source: set_fact 10472 1726773075.43443: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 10472 1726773075.43448: when evaluation is False, skipping this task 10472 1726773075.43451: _execute() done 10472 1726773075.43455: dumping result to json 10472 1726773075.43459: done dumping result, returning 10472 1726773075.43465: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems [0affffe7-6841-7dd6-8fa6-000000000402] 10472 1726773075.43473: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000402 10472 1726773075.43498: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000402 10472 1726773075.43501: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", 
"skip_reason": "Conditional result was False" } 9733 1726773075.43604: no more pending results, returning what we have 9733 1726773075.43607: results queue empty 9733 1726773075.43608: checking for any_errors_fatal 9733 1726773075.43613: done checking for any_errors_fatal 9733 1726773075.43613: checking for max_fail_percentage 9733 1726773075.43615: done checking for max_fail_percentage 9733 1726773075.43615: checking to see if all hosts have failed and the running result is not ok 9733 1726773075.43616: done checking to see if all hosts have failed 9733 1726773075.43616: getting the remaining hosts for this loop 9733 1726773075.43617: done getting the remaining hosts for this loop 9733 1726773075.43620: getting the next task for host managed_node3 9733 1726773075.43626: done getting next task for host managed_node3 9733 1726773075.43630: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 9733 1726773075.43632: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773075.43647: getting variables 9733 1726773075.43648: in VariableManager get_vars() 9733 1726773075.43682: Calling all_inventory to load vars for managed_node3 9733 1726773075.43687: Calling groups_inventory to load vars for managed_node3 9733 1726773075.43689: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773075.43698: Calling all_plugins_play to load vars for managed_node3 9733 1726773075.43700: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773075.43703: Calling groups_plugins_play to load vars for managed_node3 9733 1726773075.43861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773075.43978: done with get_vars() 9733 1726773075.43987: done getting variables 9733 1726773075.44027: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:34 Thursday 19 September 2024 15:11:15 -0400 (0:00:00.037) 0:00:21.173 **** 9733 1726773075.44047: entering _queue_task() for managed_node3/fail 9733 1726773075.44210: worker is 1 (out of 1 available) 9733 1726773075.44225: exiting _queue_task() for managed_node3/fail 9733 1726773075.44237: done queuing things up, now waiting for results queue to drain 9733 1726773075.44239: waiting for pending results... 
10474 1726773075.44369: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 10474 1726773075.44487: in run() - task 0affffe7-6841-7dd6-8fa6-000000000403 10474 1726773075.44505: variable 'ansible_search_path' from source: unknown 10474 1726773075.44510: variable 'ansible_search_path' from source: unknown 10474 1726773075.44537: calling self._execute() 10474 1726773075.44609: variable 'ansible_host' from source: host vars for 'managed_node3' 10474 1726773075.44616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10474 1726773075.44622: variable 'omit' from source: magic vars 10474 1726773075.45000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10474 1726773075.47154: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10474 1726773075.47209: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10474 1726773075.47238: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10474 1726773075.47266: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10474 1726773075.47290: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10474 1726773075.47347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10474 1726773075.47368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10474 1726773075.47392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10474 1726773075.47422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10474 1726773075.47434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10474 1726773075.47516: variable '__kernel_settings_is_transactional' from source: set_fact 10474 1726773075.47534: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 10474 1726773075.47539: when evaluation is False, skipping this task 10474 1726773075.47543: _execute() done 10474 1726773075.47546: dumping result to json 10474 1726773075.47550: done dumping result, returning 10474 1726773075.47557: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set [0affffe7-6841-7dd6-8fa6-000000000403] 10474 1726773075.47563: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000403 10474 1726773075.47592: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000403 10474 1726773075.47595: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | 
d(false)", "skip_reason": "Conditional result was False" } 9733 1726773075.47708: no more pending results, returning what we have 9733 1726773075.47711: results queue empty 9733 1726773075.47711: checking for any_errors_fatal 9733 1726773075.47716: done checking for any_errors_fatal 9733 1726773075.47716: checking for max_fail_percentage 9733 1726773075.47718: done checking for max_fail_percentage 9733 1726773075.47719: checking to see if all hosts have failed and the running result is not ok 9733 1726773075.47719: done checking to see if all hosts have failed 9733 1726773075.47720: getting the remaining hosts for this loop 9733 1726773075.47721: done getting the remaining hosts for this loop 9733 1726773075.47724: getting the next task for host managed_node3 9733 1726773075.47731: done getting next task for host managed_node3 9733 1726773075.47735: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 9733 1726773075.47737: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773075.47750: getting variables 9733 1726773075.47752: in VariableManager get_vars() 9733 1726773075.47786: Calling all_inventory to load vars for managed_node3 9733 1726773075.47789: Calling groups_inventory to load vars for managed_node3 9733 1726773075.47791: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773075.47800: Calling all_plugins_play to load vars for managed_node3 9733 1726773075.47803: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773075.47805: Calling groups_plugins_play to load vars for managed_node3 9733 1726773075.47937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773075.48058: done with get_vars() 9733 1726773075.48066: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Read tuned main config] ****** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42 Thursday 19 September 2024 15:11:15 -0400 (0:00:00.040) 0:00:21.213 **** 9733 1726773075.48127: entering _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config 9733 1726773075.48295: worker is 1 (out of 1 available) 9733 1726773075.48311: exiting _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config 9733 1726773075.48322: done queuing things up, now waiting for results queue to drain 9733 1726773075.48325: waiting for pending results... 
10476 1726773075.48459: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 10476 1726773075.48575: in run() - task 0affffe7-6841-7dd6-8fa6-000000000405 10476 1726773075.48595: variable 'ansible_search_path' from source: unknown 10476 1726773075.48600: variable 'ansible_search_path' from source: unknown 10476 1726773075.48629: calling self._execute() 10476 1726773075.48699: variable 'ansible_host' from source: host vars for 'managed_node3' 10476 1726773075.48708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10476 1726773075.48727: variable 'omit' from source: magic vars 10476 1726773075.48829: variable 'omit' from source: magic vars 10476 1726773075.48870: variable 'omit' from source: magic vars 10476 1726773075.48901: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 10476 1726773075.49206: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 10476 1726773075.49264: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10476 1726773075.49299: variable 'omit' from source: magic vars 10476 1726773075.49338: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10476 1726773075.49375: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10476 1726773075.49408: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10476 1726773075.49427: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10476 1726773075.49440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10476 1726773075.49467: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10476 1726773075.49476: variable 'ansible_host' from source: host vars for 'managed_node3' 10476 1726773075.49480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10476 1726773075.49577: Set connection var ansible_timeout to 10 10476 1726773075.49582: Set connection var ansible_shell_type to sh 10476 1726773075.49590: Set connection var ansible_module_compression to ZIP_DEFLATED 10476 1726773075.49595: Set connection var ansible_shell_executable to /bin/sh 10476 1726773075.49600: Set connection var ansible_pipelining to False 10476 1726773075.49607: Set connection var ansible_connection to ssh 10476 1726773075.49625: variable 'ansible_shell_executable' from source: unknown 10476 1726773075.49629: variable 'ansible_connection' from source: unknown 10476 1726773075.49631: variable 'ansible_module_compression' from source: unknown 10476 1726773075.49634: variable 'ansible_shell_type' from source: unknown 10476 1726773075.49636: variable 'ansible_shell_executable' from source: unknown 10476 1726773075.49639: variable 'ansible_host' from source: host vars for 'managed_node3' 10476 1726773075.49642: variable 'ansible_pipelining' from source: unknown 10476 1726773075.49644: variable 'ansible_timeout' from source: unknown 10476 1726773075.49648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10476 1726773075.49820: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10476 1726773075.49832: variable 'omit' from source: magic vars 10476 1726773075.49842: starting attempt loop 10476 1726773075.49844: running the handler 10476 1726773075.49854: _low_level_execute_command(): starting 10476 1726773075.49860: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10476 1726773075.52220: stdout chunk (state=2): >>>/root <<< 10476 1726773075.52352: stderr chunk (state=3): >>><<< 10476 1726773075.52359: stdout chunk (state=3): >>><<< 10476 1726773075.52380: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10476 1726773075.52395: _low_level_execute_command(): starting 10476 1726773075.52402: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773075.5238972-10476-218505412082139 `" && echo ansible-tmp-1726773075.5238972-10476-218505412082139="` echo /root/.ansible/tmp/ansible-tmp-1726773075.5238972-10476-218505412082139 `" ) && sleep 0' 10476 1726773075.55197: stdout chunk (state=2): >>>ansible-tmp-1726773075.5238972-10476-218505412082139=/root/.ansible/tmp/ansible-tmp-1726773075.5238972-10476-218505412082139 <<< 10476 1726773075.55346: stderr chunk (state=3): >>><<< 10476 1726773075.55354: stdout chunk (state=3): >>><<< 10476 1726773075.55376: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773075.5238972-10476-218505412082139=/root/.ansible/tmp/ansible-tmp-1726773075.5238972-10476-218505412082139 , stderr= 10476 1726773075.55421: variable 'ansible_module_compression' from source: unknown 10476 1726773075.55455: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 10476 1726773075.55493: variable 'ansible_facts' from source: unknown 10476 1726773075.55598: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773075.5238972-10476-218505412082139/AnsiballZ_kernel_settings_get_config.py 10476 1726773075.56065: Sending initial data 10476 1726773075.56075: Sent initial data (174 bytes) 10476 1726773075.58534: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmp0d5l7yz2 /root/.ansible/tmp/ansible-tmp-1726773075.5238972-10476-218505412082139/AnsiballZ_kernel_settings_get_config.py <<< 10476 1726773075.59873: stderr chunk (state=3): >>><<< 10476 1726773075.59886: stdout chunk (state=3): >>><<< 10476 1726773075.59909: done transferring module to remote 10476 1726773075.59921: _low_level_execute_command(): starting 10476 1726773075.59926: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773075.5238972-10476-218505412082139/ /root/.ansible/tmp/ansible-tmp-1726773075.5238972-10476-218505412082139/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10476 1726773075.62390: stderr chunk (state=2): >>><<< 10476 1726773075.62398: stdout chunk (state=2): >>><<< 10476 1726773075.62411: _low_level_execute_command() done: rc=0, stdout=, stderr= 10476 1726773075.62415: _low_level_execute_command(): starting 10476 1726773075.62420: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773075.5238972-10476-218505412082139/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10476 1726773075.78421: stdout 
chunk (state=2): >>> {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} <<< 10476 1726773075.79514: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10476 1726773075.79562: stderr chunk (state=3): >>><<< 10476 1726773075.79568: stdout chunk (state=3): >>><<< 10476 1726773075.79587: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} , stderr=Shared connection to 10.31.47.99 closed. 10476 1726773075.79615: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/tuned-main.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773075.5238972-10476-218505412082139/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10476 1726773075.79627: _low_level_execute_command(): starting 10476 1726773075.79632: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773075.5238972-10476-218505412082139/ > /dev/null 2>&1 && sleep 0' 10476 1726773075.82133: stderr chunk (state=2): >>><<< 10476 1726773075.82143: stdout chunk (state=2): >>><<< 10476 1726773075.82160: _low_level_execute_command() done: rc=0, stdout=, stderr= 10476 1726773075.82168: handler run complete 10476 1726773075.82190: attempt loop complete, returning result 10476 1726773075.82196: _execute() done 10476 1726773075.82199: dumping result to json 10476 1726773075.82204: done dumping result, returning 10476 1726773075.82212: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config [0affffe7-6841-7dd6-8fa6-000000000405] 10476 1726773075.82218: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000405 10476 1726773075.82258: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000405 10476 1726773075.82262: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "data": { "daemon": "1", "default_instance_priority": "0", "dynamic_tuning": "0", "log_file_count": "2", "log_file_max_size": "1MB", "reapply_sysctl": "1", "recommend_command": "1", "sleep_interval": "1", "udev_buffer_size": "1MB", "update_interval": "10" } } 9733 1726773075.82697: no more pending results, returning what we have 9733 1726773075.82700: results queue empty 9733 1726773075.82701: checking for any_errors_fatal 9733 1726773075.82708: done checking for any_errors_fatal 9733 1726773075.82709: checking for max_fail_percentage 9733 1726773075.82710: 
done checking for max_fail_percentage 9733 1726773075.82711: checking to see if all hosts have failed and the running result is not ok 9733 1726773075.82711: done checking to see if all hosts have failed 9733 1726773075.82712: getting the remaining hosts for this loop 9733 1726773075.82713: done getting the remaining hosts for this loop 9733 1726773075.82716: getting the next task for host managed_node3 9733 1726773075.82722: done getting next task for host managed_node3 9733 1726773075.82726: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 9733 1726773075.82728: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773075.82739: getting variables 9733 1726773075.82740: in VariableManager get_vars() 9733 1726773075.82780: Calling all_inventory to load vars for managed_node3 9733 1726773075.82783: Calling groups_inventory to load vars for managed_node3 9733 1726773075.82787: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773075.82797: Calling all_plugins_play to load vars for managed_node3 9733 1726773075.82800: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773075.82803: Calling groups_plugins_play to load vars for managed_node3 9733 1726773075.83027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773075.83225: done with get_vars() 9733 1726773075.83235: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 Thursday 19 September 2024 15:11:15 -0400 (0:00:00.351) 0:00:21.565 **** 9733 1726773075.83320: entering _queue_task() for managed_node3/stat 9733 1726773075.83509: worker is 1 (out of 1 available) 9733 1726773075.83522: exiting _queue_task() for managed_node3/stat 9733 1726773075.83532: done queuing things up, now waiting for results queue to drain 9733 1726773075.83534: waiting for pending results... 
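The "Read tuned main config" task that just completed invokes the role's kernel_settings_get_config module against /etc/tuned/tuned-main.conf (see the _execute_module arguments above) and returns the daemon settings as a data dictionary. A minimal sketch of an equivalent task, assuming the result is captured under the __kernel_settings_register_tuned_main name referenced later in the log (the role may instead populate that variable via set_fact):

- name: Read tuned main config
  fedora.linux_system_roles.kernel_settings_get_config:
    path: /etc/tuned/tuned-main.conf
  register: __kernel_settings_register_tuned_main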
10491 1726773075.83754: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 10491 1726773075.83880: in run() - task 0affffe7-6841-7dd6-8fa6-000000000406 10491 1726773075.83899: variable 'ansible_search_path' from source: unknown 10491 1726773075.83903: variable 'ansible_search_path' from source: unknown 10491 1726773075.83946: variable '__prof_from_conf' from source: task vars 10491 1726773075.84247: variable '__prof_from_conf' from source: task vars 10491 1726773075.84437: variable '__data' from source: task vars 10491 1726773075.84512: variable '__kernel_settings_register_tuned_main' from source: set_fact 10491 1726773075.84759: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10491 1726773075.84773: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10491 1726773075.84834: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10491 1726773075.84851: variable 'omit' from source: magic vars 10491 1726773075.84962: variable 'ansible_host' from source: host vars for 'managed_node3' 10491 1726773075.84977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10491 1726773075.84990: variable 'omit' from source: magic vars 10491 1726773075.85276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10491 1726773075.87529: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10491 1726773075.87596: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10491 1726773075.87635: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10491 1726773075.87690: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10491 1726773075.87716: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10491 1726773075.87796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10491 1726773075.87826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10491 1726773075.87852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10491 1726773075.87913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10491 1726773075.87929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10491 1726773075.88034: variable 'item' from source: unknown 10491 1726773075.88049: Evaluated conditional (item | length > 0): False 10491 1726773075.88054: when evaluation is False, skipping this task 10491 1726773075.88093: variable 'item' from source: unknown 10491 1726773075.88164: variable 'item' from source: unknown skipping: 
[managed_node3] => (item=) => { "ansible_loop_var": "item", "changed": false, "false_condition": "item | length > 0", "item": "", "skip_reason": "Conditional result was False" } 10491 1726773075.88252: variable 'ansible_host' from source: host vars for 'managed_node3' 10491 1726773075.88262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10491 1726773075.88275: variable 'omit' from source: magic vars 10491 1726773075.88429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10491 1726773075.88454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10491 1726773075.88481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10491 1726773075.88522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10491 1726773075.88537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10491 1726773075.88618: variable 'item' from source: unknown 10491 1726773075.88627: Evaluated conditional (item | length > 0): True 10491 1726773075.88633: variable 'omit' from source: magic vars 10491 1726773075.88676: variable 'omit' from source: magic vars 10491 1726773075.88720: variable 'item' from source: unknown 10491 1726773075.88784: variable 'item' from source: unknown 10491 1726773075.88803: variable 'omit' from source: magic vars 10491 1726773075.88828: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10491 1726773075.88853: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10491 1726773075.88875: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10491 1726773075.88894: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10491 1726773075.88905: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10491 1726773075.88933: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10491 1726773075.88939: variable 'ansible_host' from source: host vars for 'managed_node3' 10491 1726773075.88943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10491 1726773075.89279: Set connection var ansible_timeout to 10 10491 1726773075.89287: Set connection var ansible_shell_type to sh 10491 1726773075.89294: Set connection var ansible_module_compression to ZIP_DEFLATED 10491 1726773075.89300: Set connection var ansible_shell_executable to /bin/sh 10491 1726773075.89305: Set connection var ansible_pipelining to False 10491 1726773075.89312: Set connection var ansible_connection to ssh 10491 1726773075.89332: variable 'ansible_shell_executable' from source: unknown 10491 1726773075.89337: variable 'ansible_connection' 
from source: unknown 10491 1726773075.89341: variable 'ansible_module_compression' from source: unknown 10491 1726773075.89344: variable 'ansible_shell_type' from source: unknown 10491 1726773075.89347: variable 'ansible_shell_executable' from source: unknown 10491 1726773075.89350: variable 'ansible_host' from source: host vars for 'managed_node3' 10491 1726773075.89354: variable 'ansible_pipelining' from source: unknown 10491 1726773075.89357: variable 'ansible_timeout' from source: unknown 10491 1726773075.89361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10491 1726773075.89498: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10491 1726773075.89509: variable 'omit' from source: magic vars 10491 1726773075.89515: starting attempt loop 10491 1726773075.89518: running the handler 10491 1726773075.89530: _low_level_execute_command(): starting 10491 1726773075.89537: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10491 1726773075.92148: stdout chunk (state=2): >>>/root <<< 10491 1726773075.92281: stderr chunk (state=3): >>><<< 10491 1726773075.92290: stdout chunk (state=3): >>><<< 10491 1726773075.92312: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10491 1726773075.92324: _low_level_execute_command(): starting 10491 1726773075.92330: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773075.9231966-10491-175196891130888 `" && echo ansible-tmp-1726773075.9231966-10491-175196891130888="` echo /root/.ansible/tmp/ansible-tmp-1726773075.9231966-10491-175196891130888 `" ) && sleep 0' 10491 1726773075.94874: stdout chunk (state=2): >>>ansible-tmp-1726773075.9231966-10491-175196891130888=/root/.ansible/tmp/ansible-tmp-1726773075.9231966-10491-175196891130888 <<< 10491 1726773075.95007: stderr chunk (state=3): >>><<< 10491 1726773075.95016: stdout chunk (state=3): >>><<< 10491 1726773075.95031: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773075.9231966-10491-175196891130888=/root/.ansible/tmp/ansible-tmp-1726773075.9231966-10491-175196891130888 , stderr= 10491 1726773075.95069: variable 'ansible_module_compression' from source: unknown 10491 1726773075.95112: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10491 1726773075.95140: variable 'ansible_facts' from source: unknown 10491 1726773075.95213: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773075.9231966-10491-175196891130888/AnsiballZ_stat.py 10491 1726773075.95332: Sending initial data 10491 1726773075.95339: Sent initial data (152 bytes) 10491 1726773075.98063: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpr24qhyb1 /root/.ansible/tmp/ansible-tmp-1726773075.9231966-10491-175196891130888/AnsiballZ_stat.py <<< 10491 1726773076.00594: stderr chunk (state=3): >>><<< 10491 1726773076.00605: stdout chunk (state=3): >>><<< 10491 1726773076.00641: done transferring module to remote 10491 1726773076.00654: _low_level_execute_command(): starting 10491 1726773076.00659: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773075.9231966-10491-175196891130888/ 
/root/.ansible/tmp/ansible-tmp-1726773075.9231966-10491-175196891130888/AnsiballZ_stat.py && sleep 0' 10491 1726773076.03317: stderr chunk (state=2): >>><<< 10491 1726773076.03326: stdout chunk (state=2): >>><<< 10491 1726773076.03340: _low_level_execute_command() done: rc=0, stdout=, stderr= 10491 1726773076.03344: _low_level_execute_command(): starting 10491 1726773076.03350: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773075.9231966-10491-175196891130888/AnsiballZ_stat.py && sleep 0' 10491 1726773076.18309: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 10491 1726773076.19356: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10491 1726773076.19405: stderr chunk (state=3): >>><<< 10491 1726773076.19412: stdout chunk (state=3): >>><<< 10491 1726773076.19430: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.47.99 closed. 10491 1726773076.19452: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773075.9231966-10491-175196891130888/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10491 1726773076.19463: _low_level_execute_command(): starting 10491 1726773076.19468: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773075.9231966-10491-175196891130888/ > /dev/null 2>&1 && sleep 0' 10491 1726773076.21945: stderr chunk (state=2): >>><<< 10491 1726773076.21954: stdout chunk (state=2): >>><<< 10491 1726773076.21968: _low_level_execute_command() done: rc=0, stdout=, stderr= 10491 1726773076.21977: handler run complete 10491 1726773076.21994: attempt loop complete, returning result 10491 1726773076.22012: variable 'item' from source: unknown 10491 1726773076.22074: variable 'item' from source: unknown ok: [managed_node3] => (item=/etc/tuned/profiles) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned/profiles", "stat": { "exists": false } } 10491 1726773076.22159: variable 'ansible_host' from source: host vars for 'managed_node3' 10491 1726773076.22170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10491 1726773076.22183: variable 'omit' from source: magic vars 10491 1726773076.22295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10491 1726773076.22318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10491 1726773076.22336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10491 1726773076.22364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10491 1726773076.22378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10491 1726773076.22437: variable 'item' from source: unknown 10491 1726773076.22446: Evaluated conditional (item | length > 0): True 10491 1726773076.22451: variable 'omit' from source: magic vars 10491 1726773076.22462: variable 'omit' from source: magic vars 10491 1726773076.22496: variable 'item' from source: unknown 10491 1726773076.22542: variable 'item' from source: unknown 10491 1726773076.22556: variable 'omit' from source: magic vars 10491 1726773076.22573: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10491 1726773076.22581: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10491 1726773076.22589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10491 1726773076.22601: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10491 1726773076.22605: variable 'ansible_host' from source: host vars for 'managed_node3' 10491 1726773076.22609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10491 1726773076.22658: Set connection var ansible_timeout to 10 10491 1726773076.22663: Set connection var ansible_shell_type to sh 10491 1726773076.22668: Set connection var ansible_module_compression to ZIP_DEFLATED 10491 1726773076.22673: Set connection var ansible_shell_executable to /bin/sh 10491 1726773076.22679: Set connection var ansible_pipelining to False 10491 1726773076.22686: Set connection var ansible_connection to ssh 10491 1726773076.22701: variable 'ansible_shell_executable' from source: unknown 10491 1726773076.22704: variable 'ansible_connection' from source: unknown 10491 1726773076.22707: variable 'ansible_module_compression' from source: unknown 10491 1726773076.22710: variable 'ansible_shell_type' from source: unknown 10491 1726773076.22713: variable 'ansible_shell_executable' from source: unknown 10491 1726773076.22717: variable 'ansible_host' from source: host vars for 'managed_node3' 10491 1726773076.22721: variable 'ansible_pipelining' from source: unknown 10491 1726773076.22724: variable 'ansible_timeout' from source: unknown 10491 1726773076.22728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10491 1726773076.22797: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10491 1726773076.22807: variable 
'omit' from source: magic vars 10491 1726773076.22813: starting attempt loop 10491 1726773076.22816: running the handler 10491 1726773076.22823: _low_level_execute_command(): starting 10491 1726773076.22827: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10491 1726773076.25055: stdout chunk (state=2): >>>/root <<< 10491 1726773076.25182: stderr chunk (state=3): >>><<< 10491 1726773076.25191: stdout chunk (state=3): >>><<< 10491 1726773076.25207: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10491 1726773076.25216: _low_level_execute_command(): starting 10491 1726773076.25222: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773076.2521343-10491-136913764979813 `" && echo ansible-tmp-1726773076.2521343-10491-136913764979813="` echo /root/.ansible/tmp/ansible-tmp-1726773076.2521343-10491-136913764979813 `" ) && sleep 0' 10491 1726773076.27717: stdout chunk (state=2): >>>ansible-tmp-1726773076.2521343-10491-136913764979813=/root/.ansible/tmp/ansible-tmp-1726773076.2521343-10491-136913764979813 <<< 10491 1726773076.27843: stderr chunk (state=3): >>><<< 10491 1726773076.27851: stdout chunk (state=3): >>><<< 10491 1726773076.27866: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773076.2521343-10491-136913764979813=/root/.ansible/tmp/ansible-tmp-1726773076.2521343-10491-136913764979813 , stderr= 10491 1726773076.27901: variable 'ansible_module_compression' from source: unknown 10491 1726773076.27936: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10491 1726773076.27954: variable 'ansible_facts' from source: unknown 10491 1726773076.28011: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773076.2521343-10491-136913764979813/AnsiballZ_stat.py 10491 1726773076.28100: Sending initial data 10491 1726773076.28107: Sent initial data (152 bytes) 10491 1726773076.30651: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpk49it0vw /root/.ansible/tmp/ansible-tmp-1726773076.2521343-10491-136913764979813/AnsiballZ_stat.py <<< 10491 1726773076.31833: stderr chunk (state=3): >>><<< 10491 1726773076.31840: stdout chunk (state=3): >>><<< 10491 1726773076.31859: done transferring module to remote 10491 1726773076.31868: _low_level_execute_command(): starting 10491 1726773076.31875: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773076.2521343-10491-136913764979813/ /root/.ansible/tmp/ansible-tmp-1726773076.2521343-10491-136913764979813/AnsiballZ_stat.py && sleep 0' 10491 1726773076.34263: stderr chunk (state=2): >>><<< 10491 1726773076.34270: stdout chunk (state=2): >>><<< 10491 1726773076.34288: _low_level_execute_command() done: rc=0, stdout=, stderr= 10491 1726773076.34292: _low_level_execute_command(): starting 10491 1726773076.34299: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773076.2521343-10491-136913764979813/AnsiballZ_stat.py && sleep 0' 10491 1726773076.50170: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773065.3941085, 
"mtime": 1726773063.682102, "ctime": 1726773063.682102, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 10491 1726773076.51292: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10491 1726773076.51334: stderr chunk (state=3): >>><<< 10491 1726773076.51341: stdout chunk (state=3): >>><<< 10491 1726773076.51355: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773065.3941085, "mtime": 1726773063.682102, "ctime": 1726773063.682102, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.47.99 closed. 
10491 1726773076.51394: done with _execute_module (stat, {'path': '/etc/tuned', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773076.2521343-10491-136913764979813/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10491 1726773076.51402: _low_level_execute_command(): starting 10491 1726773076.51408: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773076.2521343-10491-136913764979813/ > /dev/null 2>&1 && sleep 0' 10491 1726773076.53847: stderr chunk (state=2): >>><<< 10491 1726773076.53854: stdout chunk (state=2): >>><<< 10491 1726773076.53867: _low_level_execute_command() done: rc=0, stdout=, stderr= 10491 1726773076.53873: handler run complete 10491 1726773076.53905: attempt loop complete, returning result 10491 1726773076.53921: variable 'item' from source: unknown 10491 1726773076.53980: variable 'item' from source: unknown ok: [managed_node3] => (item=/etc/tuned) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned", "stat": { "atime": 1726773065.3941085, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1726773063.682102, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 917919, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/directory", "mode": "0755", "mtime": 1726773063.682102, "nlink": 4, "path": "/etc/tuned", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 159, "uid": 0, "version": "1785990601", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 10491 1726773076.54024: dumping result to json 10491 1726773076.54034: done dumping result, returning 10491 1726773076.54042: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory [0affffe7-6841-7dd6-8fa6-000000000406] 10491 1726773076.54048: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000406 10491 1726773076.54090: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000406 10491 1726773076.54094: WORKER PROCESS EXITING 9733 1726773076.54270: no more pending results, returning what we have 9733 1726773076.54275: results queue empty 9733 1726773076.54276: checking for any_errors_fatal 9733 1726773076.54281: done checking for any_errors_fatal 9733 1726773076.54281: checking for max_fail_percentage 9733 1726773076.54283: done checking for max_fail_percentage 9733 1726773076.54283: checking to see if all hosts have failed and the running result is not ok 9733 1726773076.54284: done checking to see if all hosts have failed 9733 1726773076.54284: getting the remaining hosts for this loop 9733 1726773076.54287: done getting the remaining hosts for this loop 9733 1726773076.54290: getting the next task for host managed_node3 9733 1726773076.54295: done getting next task for host managed_node3 9733 
1726773076.54298: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 9733 1726773076.54300: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773076.54310: getting variables 9733 1726773076.54311: in VariableManager get_vars() 9733 1726773076.54344: Calling all_inventory to load vars for managed_node3 9733 1726773076.54347: Calling groups_inventory to load vars for managed_node3 9733 1726773076.54348: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773076.54357: Calling all_plugins_play to load vars for managed_node3 9733 1726773076.54359: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773076.54362: Calling groups_plugins_play to load vars for managed_node3 9733 1726773076.54467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773076.54587: done with get_vars() 9733 1726773076.54596: done getting variables 9733 1726773076.54638: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:63 Thursday 19 September 2024 15:11:16 -0400 (0:00:00.713) 0:00:22.279 **** 9733 1726773076.54658: entering _queue_task() for managed_node3/set_fact 9733 1726773076.54824: worker is 1 (out of 1 available) 9733 1726773076.54841: exiting _queue_task() for managed_node3/set_fact 9733 1726773076.54852: done queuing things up, now waiting for results queue to drain 9733 1726773076.54854: waiting for pending results... 
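The "Find tuned profile parent directory" task executed above stats a list of candidate directories and skips empty entries via when: item | length > 0; on this run the empty item was skipped, /etc/tuned/profiles did not exist, and /etc/tuned did. A minimal sketch of that loop, assuming the candidate list is exposed as __prof_from_conf (the variable name appears in the log, but how it is built is not shown here) and the loop results are kept under __kernel_settings_find_profile_dirs:

- name: Find tuned profile parent directory
  ansible.builtin.stat:
    path: "{{ item }}"
  loop: "{{ __prof_from_conf }}"  # resolved on this run to ['', '/etc/tuned/profiles', '/etc/tuned']
  when: item | length > 0
  register: __kernel_settings_find_profile_dirs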
10524 1726773076.54977: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 10524 1726773076.55092: in run() - task 0affffe7-6841-7dd6-8fa6-000000000407 10524 1726773076.55108: variable 'ansible_search_path' from source: unknown 10524 1726773076.55113: variable 'ansible_search_path' from source: unknown 10524 1726773076.55141: calling self._execute() 10524 1726773076.55212: variable 'ansible_host' from source: host vars for 'managed_node3' 10524 1726773076.55222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10524 1726773076.55231: variable 'omit' from source: magic vars 10524 1726773076.55377: variable 'omit' from source: magic vars 10524 1726773076.55415: variable 'omit' from source: magic vars 10524 1726773076.55726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10524 1726773076.57209: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10524 1726773076.57263: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10524 1726773076.57296: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10524 1726773076.57322: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10524 1726773076.57342: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10524 1726773076.57402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10524 1726773076.57423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10524 1726773076.57442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10524 1726773076.57469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10524 1726773076.57486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10524 1726773076.57520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10524 1726773076.57537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10524 1726773076.57554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10524 1726773076.57584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10524 1726773076.57598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10524 1726773076.57637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10524 1726773076.57655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10524 1726773076.57674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10524 1726773076.57703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10524 1726773076.57715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10524 1726773076.57865: variable '__kernel_settings_find_profile_dirs' from source: set_fact 10524 1726773076.57938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10524 1726773076.58047: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10524 1726773076.58076: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10524 1726773076.58101: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10524 1726773076.58123: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10524 1726773076.58153: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10524 1726773076.58170: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10524 1726773076.58192: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10524 1726773076.58211: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10524 1726773076.58247: variable 'omit' from source: magic vars 10524 1726773076.58268: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10524 1726773076.58291: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10524 1726773076.58306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10524 1726773076.58320: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10524 1726773076.58329: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10524 1726773076.58354: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10524 1726773076.58359: variable 'ansible_host' from source: host vars for 'managed_node3' 10524 1726773076.58363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10524 1726773076.58430: Set connection var ansible_timeout to 10 10524 1726773076.58436: Set connection var ansible_shell_type to sh 10524 1726773076.58441: Set connection var ansible_module_compression to ZIP_DEFLATED 10524 1726773076.58447: Set connection var ansible_shell_executable to /bin/sh 10524 1726773076.58453: Set connection var ansible_pipelining to False 10524 1726773076.58460: Set connection var ansible_connection to ssh 10524 1726773076.58480: variable 'ansible_shell_executable' from source: unknown 10524 1726773076.58484: variable 'ansible_connection' from source: unknown 10524 1726773076.58490: variable 'ansible_module_compression' from source: unknown 10524 1726773076.58493: variable 'ansible_shell_type' from source: unknown 10524 1726773076.58497: variable 'ansible_shell_executable' from source: unknown 10524 1726773076.58500: variable 'ansible_host' from source: host vars for 'managed_node3' 10524 1726773076.58504: variable 'ansible_pipelining' from source: unknown 10524 1726773076.58507: variable 'ansible_timeout' from source: unknown 10524 1726773076.58512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10524 1726773076.58576: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10524 1726773076.58589: variable 'omit' from source: magic vars 10524 1726773076.58596: starting attempt loop 10524 1726773076.58599: running the handler 10524 1726773076.58609: handler run complete 10524 1726773076.58616: attempt loop complete, returning result 10524 1726773076.58619: _execute() done 10524 1726773076.58622: dumping result to json 10524 1726773076.58626: done dumping result, returning 10524 1726773076.58632: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir [0affffe7-6841-7dd6-8fa6-000000000407] 10524 1726773076.58637: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000407 10524 1726773076.58656: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000407 10524 1726773076.58659: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_profile_parent": "/etc/tuned" }, "changed": false } 9733 1726773076.58780: no more pending results, returning what we have 9733 1726773076.58783: results queue empty 9733 1726773076.58783: checking for any_errors_fatal 9733 1726773076.58795: done checking for any_errors_fatal 9733 1726773076.58796: checking for max_fail_percentage 9733 1726773076.58797: done checking for max_fail_percentage 9733 1726773076.58798: checking to see if all hosts have failed and the running result is not ok 9733 1726773076.58798: done checking to see if all hosts have failed 9733 1726773076.58799: getting the 
remaining hosts for this loop 9733 1726773076.58800: done getting the remaining hosts for this loop 9733 1726773076.58803: getting the next task for host managed_node3 9733 1726773076.58808: done getting next task for host managed_node3 9733 1726773076.58811: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 9733 1726773076.58813: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773076.58823: getting variables 9733 1726773076.58824: in VariableManager get_vars() 9733 1726773076.58858: Calling all_inventory to load vars for managed_node3 9733 1726773076.58861: Calling groups_inventory to load vars for managed_node3 9733 1726773076.58863: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773076.58872: Calling all_plugins_play to load vars for managed_node3 9733 1726773076.58874: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773076.58877: Calling groups_plugins_play to load vars for managed_node3 9733 1726773076.59038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773076.59152: done with get_vars() 9733 1726773076.59159: done getting variables 9733 1726773076.59202: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67 Thursday 19 September 2024 15:11:16 -0400 (0:00:00.045) 0:00:22.324 **** 9733 1726773076.59223: entering _queue_task() for managed_node3/service 9733 1726773076.59379: worker is 1 (out of 1 available) 9733 1726773076.59396: exiting _queue_task() for managed_node3/service 9733 1726773076.59408: done queuing things up, now waiting for results queue to drain 9733 1726773076.59409: waiting for pending results... 
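The set_fact result just above (__kernel_settings_profile_parent: "/etc/tuned") selects the first candidate directory whose stat result reported exists: true. A hedged sketch of such a selection, assuming the loop results from the previous task are available as __kernel_settings_find_profile_dirs (the log does not show the exact expression the role uses):

- name: Set tuned profile parent dir
  ansible.builtin.set_fact:
    __kernel_settings_profile_parent: >-
      {{ __kernel_settings_find_profile_dirs.results
         | selectattr('stat', 'defined')
         | selectattr('stat.exists')
         | map(attribute='item')
         | first }}

With the stat results shown above, this expression yields /etc/tuned, matching the fact reported in the log.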
10525 1726773076.59537: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 10525 1726773076.59651: in run() - task 0affffe7-6841-7dd6-8fa6-000000000408 10525 1726773076.59667: variable 'ansible_search_path' from source: unknown 10525 1726773076.59671: variable 'ansible_search_path' from source: unknown 10525 1726773076.59709: variable '__kernel_settings_services' from source: include_vars 10525 1726773076.59942: variable '__kernel_settings_services' from source: include_vars 10525 1726773076.60003: variable 'omit' from source: magic vars 10525 1726773076.60091: variable 'ansible_host' from source: host vars for 'managed_node3' 10525 1726773076.60102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10525 1726773076.60110: variable 'omit' from source: magic vars 10525 1726773076.60162: variable 'omit' from source: magic vars 10525 1726773076.60198: variable 'omit' from source: magic vars 10525 1726773076.60228: variable 'item' from source: unknown 10525 1726773076.60287: variable 'item' from source: unknown 10525 1726773076.60308: variable 'omit' from source: magic vars 10525 1726773076.60337: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10525 1726773076.60364: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10525 1726773076.60386: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10525 1726773076.60401: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10525 1726773076.60411: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10525 1726773076.60433: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10525 1726773076.60438: variable 'ansible_host' from source: host vars for 'managed_node3' 10525 1726773076.60442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10525 1726773076.60512: Set connection var ansible_timeout to 10 10525 1726773076.60517: Set connection var ansible_shell_type to sh 10525 1726773076.60523: Set connection var ansible_module_compression to ZIP_DEFLATED 10525 1726773076.60529: Set connection var ansible_shell_executable to /bin/sh 10525 1726773076.60535: Set connection var ansible_pipelining to False 10525 1726773076.60541: Set connection var ansible_connection to ssh 10525 1726773076.60553: variable 'ansible_shell_executable' from source: unknown 10525 1726773076.60557: variable 'ansible_connection' from source: unknown 10525 1726773076.60561: variable 'ansible_module_compression' from source: unknown 10525 1726773076.60565: variable 'ansible_shell_type' from source: unknown 10525 1726773076.60568: variable 'ansible_shell_executable' from source: unknown 10525 1726773076.60574: variable 'ansible_host' from source: host vars for 'managed_node3' 10525 1726773076.60578: variable 'ansible_pipelining' from source: unknown 10525 1726773076.60582: variable 'ansible_timeout' from source: unknown 10525 1726773076.60587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10525 1726773076.60676: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10525 1726773076.60693: variable 'omit' from source: magic vars 10525 1726773076.60699: starting attempt loop 10525 1726773076.60703: running the handler 10525 1726773076.60764: variable 'ansible_facts' from source: unknown 10525 1726773076.60853: _low_level_execute_command(): starting 10525 1726773076.60862: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10525 1726773076.63236: stdout chunk (state=2): >>>/root <<< 10525 1726773076.63354: stderr chunk (state=3): >>><<< 10525 1726773076.63361: stdout chunk (state=3): >>><<< 10525 1726773076.63379: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10525 1726773076.63393: _low_level_execute_command(): starting 10525 1726773076.63399: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773076.6338787-10525-260102328518720 `" && echo ansible-tmp-1726773076.6338787-10525-260102328518720="` echo /root/.ansible/tmp/ansible-tmp-1726773076.6338787-10525-260102328518720 `" ) && sleep 0' 10525 1726773076.65894: stdout chunk (state=2): >>>ansible-tmp-1726773076.6338787-10525-260102328518720=/root/.ansible/tmp/ansible-tmp-1726773076.6338787-10525-260102328518720 <<< 10525 1726773076.66024: stderr chunk (state=3): >>><<< 10525 1726773076.66032: stdout chunk (state=3): >>><<< 10525 1726773076.66047: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773076.6338787-10525-260102328518720=/root/.ansible/tmp/ansible-tmp-1726773076.6338787-10525-260102328518720 , stderr= 10525 1726773076.66076: variable 'ansible_module_compression' from source: unknown 10525 1726773076.66120: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 10525 1726773076.66172: variable 'ansible_facts' from source: unknown 10525 1726773076.66357: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773076.6338787-10525-260102328518720/AnsiballZ_systemd.py 10525 1726773076.66469: Sending initial data 10525 1726773076.66476: Sent initial data (155 bytes) 10525 1726773076.69073: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmp2ptbaqkg /root/.ansible/tmp/ansible-tmp-1726773076.6338787-10525-260102328518720/AnsiballZ_systemd.py <<< 10525 1726773076.71412: stderr chunk (state=3): >>><<< 10525 1726773076.71421: stdout chunk (state=3): >>><<< 10525 1726773076.71441: done transferring module to remote 10525 1726773076.71452: _low_level_execute_command(): starting 10525 1726773076.71457: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773076.6338787-10525-260102328518720/ /root/.ansible/tmp/ansible-tmp-1726773076.6338787-10525-260102328518720/AnsiballZ_systemd.py && sleep 0' 10525 1726773076.74150: stderr chunk (state=2): >>><<< 10525 1726773076.74160: stdout chunk (state=2): >>><<< 10525 1726773076.74178: _low_level_execute_command() done: rc=0, stdout=, stderr= 10525 1726773076.74183: _low_level_execute_command(): starting 10525 1726773076.74191: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773076.6338787-10525-260102328518720/AnsiballZ_systemd.py && sleep 0' 10525 1726773077.02512: stdout chunk (state=2): >>> {"name": 
"tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:11:05 EDT", "WatchdogTimestampMonotonic": "480455087", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "15004", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ExecMainStartTimestampMonotonic": "480313127", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "15004", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:11:05 EDT] ; stop_time=[n/a] ; pid=15004 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "15097856", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "Memo<<< 10525 1726773077.02551: stdout chunk (state=3): >>>ryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": 
"inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:11:05 EDT", "S<<< 10525 1726773077.02564: stdout chunk (state=3): >>>tateChangeTimestampMonotonic": "480455091", "InactiveExitTimestamp": "Thu 2024-09-19 15:11:05 EDT", "InactiveExitTimestampMonotonic": "480313185", "ActiveEnterTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ActiveEnterTimestampMonotonic": "480455091", "ActiveExitTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ActiveExitTimestampMonotonic": "480218270", "InactiveEnterTimestamp": "Thu 2024-09-19 15:11:05 EDT", "InactiveEnterTimestampMonotonic": "480310218", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", 
"ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ConditionTimestampMonotonic": "480311469", "AssertTimestamp": "Thu 2024-09-19 15:11:05 EDT", "AssertTimestampMonotonic": "480311470", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5c390172c7314a188777ca74147bd412", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 10525 1726773077.04248: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10525 1726773077.04295: stderr chunk (state=3): >>><<< 10525 1726773077.04302: stdout chunk (state=3): >>><<< 10525 1726773077.04322: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:11:05 EDT", "WatchdogTimestampMonotonic": "480455087", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "15004", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ExecMainStartTimestampMonotonic": "480313127", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "15004", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:11:05 EDT] ; stop_time=[n/a] ; pid=15004 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "15097856", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": 
"infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", 
"StateChangeTimestamp": "Thu 2024-09-19 15:11:05 EDT", "StateChangeTimestampMonotonic": "480455091", "InactiveExitTimestamp": "Thu 2024-09-19 15:11:05 EDT", "InactiveExitTimestampMonotonic": "480313185", "ActiveEnterTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ActiveEnterTimestampMonotonic": "480455091", "ActiveExitTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ActiveExitTimestampMonotonic": "480218270", "InactiveEnterTimestamp": "Thu 2024-09-19 15:11:05 EDT", "InactiveEnterTimestampMonotonic": "480310218", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ConditionTimestampMonotonic": "480311469", "AssertTimestamp": "Thu 2024-09-19 15:11:05 EDT", "AssertTimestampMonotonic": "480311470", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5c390172c7314a188777ca74147bd412", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.47.99 closed. 10525 1726773077.04430: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773076.6338787-10525-260102328518720/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10525 1726773077.04450: _low_level_execute_command(): starting 10525 1726773077.04457: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773076.6338787-10525-260102328518720/ > /dev/null 2>&1 && sleep 0' 10525 1726773077.06992: stderr chunk (state=2): >>><<< 10525 1726773077.07001: stdout chunk (state=2): >>><<< 10525 1726773077.07016: _low_level_execute_command() done: rc=0, stdout=, stderr= 10525 1726773077.07023: handler run complete 10525 1726773077.07055: attempt loop complete, returning result 10525 1726773077.07075: variable 'item' from source: unknown 10525 1726773077.07135: variable 'item' from source: unknown ok: [managed_node3] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ActiveEnterTimestampMonotonic": "480455091", "ActiveExitTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ActiveExitTimestampMonotonic": "480218270", "ActiveState": "active", "After": "systemd-journald.socket 
polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 15:11:05 EDT", "AssertTimestampMonotonic": "480311470", "Before": "shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ConditionTimestampMonotonic": "480311469", "ConfigurationDirectoryMode": "0755", "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "15004", "ExecMainStartTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ExecMainStartTimestampMonotonic": "480313127", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:11:05 EDT] ; stop_time=[n/a] ; pid=15004 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestamp": "Thu 2024-09-19 15:11:05 EDT", "InactiveEnterTimestampMonotonic": "480310218", "InactiveExitTimestamp": "Thu 2024-09-19 15:11:05 EDT", "InactiveExitTimestampMonotonic": "480313185", "InvocationID": "5c390172c7314a188777ca74147bd412", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", 
"LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "15004", "MemoryAccounting": "yes", "MemoryCurrent": "15097856", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 15:11:05 EDT", "StateChangeTimestampMonotonic": "480455091", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 
15:11:05 EDT", "WatchdogTimestampMonotonic": "480455087", "WatchdogUSec": "0" } } 10525 1726773077.07259: dumping result to json 10525 1726773077.07280: done dumping result, returning 10525 1726773077.07293: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started [0affffe7-6841-7dd6-8fa6-000000000408] 10525 1726773077.07301: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000408 10525 1726773077.07409: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000408 10525 1726773077.07414: WORKER PROCESS EXITING 9733 1726773077.08291: no more pending results, returning what we have 9733 1726773077.08294: results queue empty 9733 1726773077.08294: checking for any_errors_fatal 9733 1726773077.08298: done checking for any_errors_fatal 9733 1726773077.08298: checking for max_fail_percentage 9733 1726773077.08300: done checking for max_fail_percentage 9733 1726773077.08300: checking to see if all hosts have failed and the running result is not ok 9733 1726773077.08301: done checking to see if all hosts have failed 9733 1726773077.08301: getting the remaining hosts for this loop 9733 1726773077.08302: done getting the remaining hosts for this loop 9733 1726773077.08305: getting the next task for host managed_node3 9733 1726773077.08310: done getting next task for host managed_node3 9733 1726773077.08313: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 9733 1726773077.08315: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773077.08324: getting variables 9733 1726773077.08325: in VariableManager get_vars() 9733 1726773077.08353: Calling all_inventory to load vars for managed_node3 9733 1726773077.08355: Calling groups_inventory to load vars for managed_node3 9733 1726773077.08357: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773077.08365: Calling all_plugins_play to load vars for managed_node3 9733 1726773077.08368: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773077.08370: Calling groups_plugins_play to load vars for managed_node3 9733 1726773077.08540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773077.08729: done with get_vars() 9733 1726773077.08738: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74 Thursday 19 September 2024 15:11:17 -0400 (0:00:00.495) 0:00:22.820 **** 9733 1726773077.08823: entering _queue_task() for managed_node3/file 9733 1726773077.09034: worker is 1 (out of 1 available) 9733 1726773077.09050: exiting _queue_task() for managed_node3/file 9733 1726773077.09063: done queuing things up, now waiting for results queue to drain 9733 1726773077.09065: waiting for pending results... 10546 1726773077.09319: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 10546 1726773077.09464: in run() - task 0affffe7-6841-7dd6-8fa6-000000000409 10546 1726773077.09486: variable 'ansible_search_path' from source: unknown 10546 1726773077.09491: variable 'ansible_search_path' from source: unknown 10546 1726773077.09525: calling self._execute() 10546 1726773077.09613: variable 'ansible_host' from source: host vars for 'managed_node3' 10546 1726773077.09622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10546 1726773077.09632: variable 'omit' from source: magic vars 10546 1726773077.09732: variable 'omit' from source: magic vars 10546 1726773077.09783: variable 'omit' from source: magic vars 10546 1726773077.09813: variable '__kernel_settings_profile_dir' from source: role '' all vars 10546 1726773077.10105: variable '__kernel_settings_profile_dir' from source: role '' all vars 10546 1726773077.10255: variable '__kernel_settings_profile_parent' from source: set_fact 10546 1726773077.10264: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10546 1726773077.10314: variable 'omit' from source: magic vars 10546 1726773077.10355: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10546 1726773077.10393: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10546 1726773077.10414: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10546 1726773077.10431: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10546 1726773077.10443: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10546 1726773077.10481: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10546 1726773077.10489: variable 'ansible_host' from source: host vars for 
'managed_node3' 10546 1726773077.10493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10546 1726773077.10598: Set connection var ansible_timeout to 10 10546 1726773077.10604: Set connection var ansible_shell_type to sh 10546 1726773077.10610: Set connection var ansible_module_compression to ZIP_DEFLATED 10546 1726773077.10615: Set connection var ansible_shell_executable to /bin/sh 10546 1726773077.10621: Set connection var ansible_pipelining to False 10546 1726773077.10628: Set connection var ansible_connection to ssh 10546 1726773077.10646: variable 'ansible_shell_executable' from source: unknown 10546 1726773077.10650: variable 'ansible_connection' from source: unknown 10546 1726773077.10653: variable 'ansible_module_compression' from source: unknown 10546 1726773077.10655: variable 'ansible_shell_type' from source: unknown 10546 1726773077.10657: variable 'ansible_shell_executable' from source: unknown 10546 1726773077.10659: variable 'ansible_host' from source: host vars for 'managed_node3' 10546 1726773077.10662: variable 'ansible_pipelining' from source: unknown 10546 1726773077.10664: variable 'ansible_timeout' from source: unknown 10546 1726773077.10667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10546 1726773077.10845: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10546 1726773077.10856: variable 'omit' from source: magic vars 10546 1726773077.10862: starting attempt loop 10546 1726773077.10864: running the handler 10546 1726773077.10876: _low_level_execute_command(): starting 10546 1726773077.10882: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10546 1726773077.13681: stdout chunk (state=2): >>>/root <<< 10546 1726773077.13795: stderr chunk (state=3): >>><<< 10546 1726773077.13802: stdout chunk (state=3): >>><<< 10546 1726773077.13820: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10546 1726773077.13833: _low_level_execute_command(): starting 10546 1726773077.13840: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773077.1382768-10546-124786821353420 `" && echo ansible-tmp-1726773077.1382768-10546-124786821353420="` echo /root/.ansible/tmp/ansible-tmp-1726773077.1382768-10546-124786821353420 `" ) && sleep 0' 10546 1726773077.16327: stdout chunk (state=2): >>>ansible-tmp-1726773077.1382768-10546-124786821353420=/root/.ansible/tmp/ansible-tmp-1726773077.1382768-10546-124786821353420 <<< 10546 1726773077.16453: stderr chunk (state=3): >>><<< 10546 1726773077.16460: stdout chunk (state=3): >>><<< 10546 1726773077.16476: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773077.1382768-10546-124786821353420=/root/.ansible/tmp/ansible-tmp-1726773077.1382768-10546-124786821353420 , stderr= 10546 1726773077.16514: variable 'ansible_module_compression' from source: unknown 10546 1726773077.16553: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 10546 1726773077.16587: variable 'ansible_facts' from source: unknown 10546 1726773077.16654: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773077.1382768-10546-124786821353420/AnsiballZ_file.py 10546 
1726773077.16757: Sending initial data 10546 1726773077.16764: Sent initial data (152 bytes) 10546 1726773077.19303: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmppj11mx5v /root/.ansible/tmp/ansible-tmp-1726773077.1382768-10546-124786821353420/AnsiballZ_file.py <<< 10546 1726773077.20532: stderr chunk (state=3): >>><<< 10546 1726773077.20543: stdout chunk (state=3): >>><<< 10546 1726773077.20560: done transferring module to remote 10546 1726773077.20570: _low_level_execute_command(): starting 10546 1726773077.20575: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773077.1382768-10546-124786821353420/ /root/.ansible/tmp/ansible-tmp-1726773077.1382768-10546-124786821353420/AnsiballZ_file.py && sleep 0' 10546 1726773077.23006: stderr chunk (state=2): >>><<< 10546 1726773077.23017: stdout chunk (state=2): >>><<< 10546 1726773077.23036: _low_level_execute_command() done: rc=0, stdout=, stderr= 10546 1726773077.23041: _low_level_execute_command(): starting 10546 1726773077.23048: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773077.1382768-10546-124786821353420/AnsiballZ_file.py && sleep 0' 10546 1726773077.39061: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10546 1726773077.40199: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10546 1726773077.40249: stderr chunk (state=3): >>><<< 10546 1726773077.40255: stdout chunk (state=3): >>><<< 10546 1726773077.40274: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.47.99 closed. 
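The file-module call just logged (path=/etc/tuned/kernel_settings, state=directory, mode=0755, changed=false) corresponds to the role task "Ensure kernel settings profile directory exists" at tasks/main.yml:74. A minimal sketch of an equivalent task, assuming the path comes from the role variable __kernel_settings_profile_dir referenced earlier in this trace:

    # Sketch only: an equivalent of the logged file-module invocation; the real
    # task in the collection may use different variable names or extra options.
    - name: Ensure kernel settings profile directory exists
      file:
        path: "{{ __kernel_settings_profile_dir }}"   # resolves to /etc/tuned/kernel_settings in this run
        state: directory
        mode: "0755"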
10546 1726773077.40308: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773077.1382768-10546-124786821353420/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10546 1726773077.40319: _low_level_execute_command(): starting 10546 1726773077.40325: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773077.1382768-10546-124786821353420/ > /dev/null 2>&1 && sleep 0' 10546 1726773077.42779: stderr chunk (state=2): >>><<< 10546 1726773077.42789: stdout chunk (state=2): >>><<< 10546 1726773077.42805: _low_level_execute_command() done: rc=0, stdout=, stderr= 10546 1726773077.42814: handler run complete 10546 1726773077.42832: attempt loop complete, returning result 10546 1726773077.42836: _execute() done 10546 1726773077.42839: dumping result to json 10546 1726773077.42845: done dumping result, returning 10546 1726773077.42852: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists [0affffe7-6841-7dd6-8fa6-000000000409] 10546 1726773077.42859: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000409 10546 1726773077.42896: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000409 10546 1726773077.42900: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/tuned/kernel_settings", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "state": "directory", "uid": 0 } 9733 1726773077.43047: no more pending results, returning what we have 9733 1726773077.43050: results queue empty 9733 1726773077.43050: checking for any_errors_fatal 9733 1726773077.43065: done checking for any_errors_fatal 9733 1726773077.43066: checking for max_fail_percentage 9733 1726773077.43067: done checking for max_fail_percentage 9733 1726773077.43068: checking to see if all hosts have failed and the running result is not ok 9733 1726773077.43068: done checking to see if all hosts have failed 9733 1726773077.43069: getting the remaining hosts for this loop 9733 1726773077.43070: done getting the remaining hosts for this loop 9733 1726773077.43073: getting the next task for host managed_node3 9733 1726773077.43078: done getting next task for host managed_node3 9733 1726773077.43082: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 9733 1726773077.43084: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 9733 1726773077.43096: getting variables 9733 1726773077.43097: in VariableManager get_vars() 9733 1726773077.43132: Calling all_inventory to load vars for managed_node3 9733 1726773077.43134: Calling groups_inventory to load vars for managed_node3 9733 1726773077.43136: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773077.43146: Calling all_plugins_play to load vars for managed_node3 9733 1726773077.43148: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773077.43151: Calling groups_plugins_play to load vars for managed_node3 9733 1726773077.43264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773077.43384: done with get_vars() 9733 1726773077.43395: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get active_profile] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80 Thursday 19 September 2024 15:11:17 -0400 (0:00:00.346) 0:00:23.167 **** 9733 1726773077.43460: entering _queue_task() for managed_node3/slurp 9733 1726773077.43627: worker is 1 (out of 1 available) 9733 1726773077.43642: exiting _queue_task() for managed_node3/slurp 9733 1726773077.43654: done queuing things up, now waiting for results queue to drain 9733 1726773077.43656: waiting for pending results... 10561 1726773077.43788: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 10561 1726773077.43898: in run() - task 0affffe7-6841-7dd6-8fa6-00000000040a 10561 1726773077.43915: variable 'ansible_search_path' from source: unknown 10561 1726773077.43918: variable 'ansible_search_path' from source: unknown 10561 1726773077.43945: calling self._execute() 10561 1726773077.44014: variable 'ansible_host' from source: host vars for 'managed_node3' 10561 1726773077.44022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10561 1726773077.44028: variable 'omit' from source: magic vars 10561 1726773077.44111: variable 'omit' from source: magic vars 10561 1726773077.44146: variable 'omit' from source: magic vars 10561 1726773077.44167: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 10561 1726773077.44392: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 10561 1726773077.44451: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10561 1726773077.44483: variable 'omit' from source: magic vars 10561 1726773077.44515: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10561 1726773077.44540: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10561 1726773077.44558: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10561 1726773077.44569: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10561 1726773077.44581: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10561 1726773077.44603: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10561 1726773077.44607: variable 'ansible_host' from source: host vars for 'managed_node3' 10561 1726773077.44609: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 10561 1726773077.44691: Set connection var ansible_timeout to 10 10561 1726773077.44697: Set connection var ansible_shell_type to sh 10561 1726773077.44702: Set connection var ansible_module_compression to ZIP_DEFLATED 10561 1726773077.44708: Set connection var ansible_shell_executable to /bin/sh 10561 1726773077.44712: Set connection var ansible_pipelining to False 10561 1726773077.44719: Set connection var ansible_connection to ssh 10561 1726773077.44738: variable 'ansible_shell_executable' from source: unknown 10561 1726773077.44744: variable 'ansible_connection' from source: unknown 10561 1726773077.44748: variable 'ansible_module_compression' from source: unknown 10561 1726773077.44751: variable 'ansible_shell_type' from source: unknown 10561 1726773077.44755: variable 'ansible_shell_executable' from source: unknown 10561 1726773077.44758: variable 'ansible_host' from source: host vars for 'managed_node3' 10561 1726773077.44762: variable 'ansible_pipelining' from source: unknown 10561 1726773077.44765: variable 'ansible_timeout' from source: unknown 10561 1726773077.44771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10561 1726773077.44907: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10561 1726773077.44916: variable 'omit' from source: magic vars 10561 1726773077.44920: starting attempt loop 10561 1726773077.44922: running the handler 10561 1726773077.44931: _low_level_execute_command(): starting 10561 1726773077.44937: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10561 1726773077.47296: stdout chunk (state=2): >>>/root <<< 10561 1726773077.47413: stderr chunk (state=3): >>><<< 10561 1726773077.47420: stdout chunk (state=3): >>><<< 10561 1726773077.47437: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10561 1726773077.47450: _low_level_execute_command(): starting 10561 1726773077.47456: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773077.4744482-10561-187285112233982 `" && echo ansible-tmp-1726773077.4744482-10561-187285112233982="` echo /root/.ansible/tmp/ansible-tmp-1726773077.4744482-10561-187285112233982 `" ) && sleep 0' 10561 1726773077.49999: stdout chunk (state=2): >>>ansible-tmp-1726773077.4744482-10561-187285112233982=/root/.ansible/tmp/ansible-tmp-1726773077.4744482-10561-187285112233982 <<< 10561 1726773077.50152: stderr chunk (state=3): >>><<< 10561 1726773077.50160: stdout chunk (state=3): >>><<< 10561 1726773077.50179: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773077.4744482-10561-187285112233982=/root/.ansible/tmp/ansible-tmp-1726773077.4744482-10561-187285112233982 , stderr= 10561 1726773077.50224: variable 'ansible_module_compression' from source: unknown 10561 1726773077.50265: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.slurp-ZIP_DEFLATED 10561 1726773077.50300: variable 'ansible_facts' from source: unknown 10561 1726773077.50406: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773077.4744482-10561-187285112233982/AnsiballZ_slurp.py 10561 1726773077.50773: Sending initial data 10561 1726773077.50780: Sent initial data (153 
bytes) 10561 1726773077.53140: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpqizeu2mj /root/.ansible/tmp/ansible-tmp-1726773077.4744482-10561-187285112233982/AnsiballZ_slurp.py <<< 10561 1726773077.54280: stderr chunk (state=3): >>><<< 10561 1726773077.54288: stdout chunk (state=3): >>><<< 10561 1726773077.54306: done transferring module to remote 10561 1726773077.54317: _low_level_execute_command(): starting 10561 1726773077.54323: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773077.4744482-10561-187285112233982/ /root/.ansible/tmp/ansible-tmp-1726773077.4744482-10561-187285112233982/AnsiballZ_slurp.py && sleep 0' 10561 1726773077.56737: stderr chunk (state=2): >>><<< 10561 1726773077.56746: stdout chunk (state=2): >>><<< 10561 1726773077.56760: _low_level_execute_command() done: rc=0, stdout=, stderr= 10561 1726773077.56764: _low_level_execute_command(): starting 10561 1726773077.56769: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773077.4744482-10561-187285112233982/AnsiballZ_slurp.py && sleep 0' 10561 1726773077.71697: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 10561 1726773077.73057: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10561 1726773077.73068: stdout chunk (state=3): >>><<< 10561 1726773077.73081: stderr chunk (state=3): >>><<< 10561 1726773077.73096: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.47.99 closed. 
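The slurp result above reads /etc/tuned/active_profile and returns its content base64-encoded; "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK" decodes to "virtual-guest kernel_settings" (with a trailing newline), i.e. the profile tuned currently has active. A minimal sketch of the "Get active_profile" task at tasks/main.yml:80, assuming the path is built from the role variable __kernel_settings_tuned_active_profile seen earlier in the trace; the register name below is an illustration, not the role's actual name:

    # Sketch only: reconstructed from the logged slurp invocation. Decoding the
    # base64 content is typically done in a later task (e.g. via b64decode).
    - name: Get active_profile
      slurp:
        path: "{{ __kernel_settings_tuned_active_profile }}"   # /etc/tuned/active_profile in this run
      register: __kernel_settings_active_profile_content       # register name assumed for illustration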
10561 1726773077.73121: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773077.4744482-10561-187285112233982/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10561 1726773077.73134: _low_level_execute_command(): starting 10561 1726773077.73140: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773077.4744482-10561-187285112233982/ > /dev/null 2>&1 && sleep 0' 10561 1726773077.75833: stderr chunk (state=2): >>><<< 10561 1726773077.75843: stdout chunk (state=2): >>><<< 10561 1726773077.75862: _low_level_execute_command() done: rc=0, stdout=, stderr= 10561 1726773077.75869: handler run complete 10561 1726773077.75883: attempt loop complete, returning result 10561 1726773077.75889: _execute() done 10561 1726773077.75893: dumping result to json 10561 1726773077.75897: done dumping result, returning 10561 1726773077.75905: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile [0affffe7-6841-7dd6-8fa6-00000000040a] 10561 1726773077.75911: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000040a 10561 1726773077.75939: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000040a 10561 1726773077.75942: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "encoding": "base64", "source": "/etc/tuned/active_profile" } 9733 1726773077.76079: no more pending results, returning what we have 9733 1726773077.76082: results queue empty 9733 1726773077.76083: checking for any_errors_fatal 9733 1726773077.76094: done checking for any_errors_fatal 9733 1726773077.76094: checking for max_fail_percentage 9733 1726773077.76096: done checking for max_fail_percentage 9733 1726773077.76096: checking to see if all hosts have failed and the running result is not ok 9733 1726773077.76097: done checking to see if all hosts have failed 9733 1726773077.76097: getting the remaining hosts for this loop 9733 1726773077.76098: done getting the remaining hosts for this loop 9733 1726773077.76102: getting the next task for host managed_node3 9733 1726773077.76107: done getting next task for host managed_node3 9733 1726773077.76111: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 9733 1726773077.76113: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773077.76124: getting variables 9733 1726773077.76126: in VariableManager get_vars() 9733 1726773077.76161: Calling all_inventory to load vars for managed_node3 9733 1726773077.76164: Calling groups_inventory to load vars for managed_node3 9733 1726773077.76165: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773077.76177: Calling all_plugins_play to load vars for managed_node3 9733 1726773077.76180: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773077.76182: Calling groups_plugins_play to load vars for managed_node3 9733 1726773077.76301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773077.76433: done with get_vars() 9733 1726773077.76441: done getting variables 9733 1726773077.76488: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set active_profile] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:85 Thursday 19 September 2024 15:11:17 -0400 (0:00:00.330) 0:00:23.497 **** 9733 1726773077.76510: entering _queue_task() for managed_node3/set_fact 9733 1726773077.76684: worker is 1 (out of 1 available) 9733 1726773077.76700: exiting _queue_task() for managed_node3/set_fact 9733 1726773077.76715: done queuing things up, now waiting for results queue to drain 9733 1726773077.76717: waiting for pending results... 10576 1726773077.76841: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 10576 1726773077.76960: in run() - task 0affffe7-6841-7dd6-8fa6-00000000040b 10576 1726773077.76977: variable 'ansible_search_path' from source: unknown 10576 1726773077.76980: variable 'ansible_search_path' from source: unknown 10576 1726773077.77021: calling self._execute() 10576 1726773077.77190: variable 'ansible_host' from source: host vars for 'managed_node3' 10576 1726773077.77200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10576 1726773077.77208: variable 'omit' from source: magic vars 10576 1726773077.77304: variable 'omit' from source: magic vars 10576 1726773077.77350: variable 'omit' from source: magic vars 10576 1726773077.77728: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10576 1726773077.77739: variable '__cur_profile' from source: task vars 10576 1726773077.77879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10576 1726773077.79625: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10576 1726773077.79674: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10576 1726773077.79717: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10576 1726773077.79743: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10576 1726773077.79764: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10576 1726773077.79822: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10576 1726773077.79843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10576 1726773077.79863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10576 1726773077.79900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10576 1726773077.79913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10576 1726773077.79993: variable '__kernel_settings_tuned_current_profile' from source: set_fact 10576 1726773077.80037: variable 'omit' from source: magic vars 10576 1726773077.80060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10576 1726773077.80082: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10576 1726773077.80099: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10576 1726773077.80113: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10576 1726773077.80123: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10576 1726773077.80146: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10576 1726773077.80150: variable 'ansible_host' from source: host vars for 'managed_node3' 10576 1726773077.80152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10576 1726773077.80250: Set connection var ansible_timeout to 10 10576 1726773077.80262: Set connection var ansible_shell_type to sh 10576 1726773077.80272: Set connection var ansible_module_compression to ZIP_DEFLATED 10576 1726773077.80278: Set connection var ansible_shell_executable to /bin/sh 10576 1726773077.80283: Set connection var ansible_pipelining to False 10576 1726773077.80291: Set connection var ansible_connection to ssh 10576 1726773077.80311: variable 'ansible_shell_executable' from source: unknown 10576 1726773077.80315: variable 'ansible_connection' from source: unknown 10576 1726773077.80318: variable 'ansible_module_compression' from source: unknown 10576 1726773077.80320: variable 'ansible_shell_type' from source: unknown 10576 1726773077.80322: variable 'ansible_shell_executable' from source: unknown 10576 1726773077.80325: variable 'ansible_host' from source: host vars for 'managed_node3' 10576 1726773077.80328: variable 'ansible_pipelining' from source: unknown 10576 1726773077.80331: variable 'ansible_timeout' from source: unknown 10576 1726773077.80334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10576 1726773077.80421: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10576 1726773077.80433: variable 'omit' from source: magic vars 10576 1726773077.80440: starting attempt loop 10576 1726773077.80443: running the handler 10576 1726773077.80453: handler run complete 10576 1726773077.80461: attempt loop complete, returning result 10576 1726773077.80465: _execute() done 10576 1726773077.80468: dumping result to json 10576 1726773077.80471: done dumping result, returning 10576 1726773077.80477: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile [0affffe7-6841-7dd6-8fa6-00000000040b] 10576 1726773077.80484: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000040b 10576 1726773077.80508: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000040b 10576 1726773077.80511: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_active_profile": "virtual-guest kernel_settings" }, "changed": false } 9733 1726773077.81067: no more pending results, returning what we have 9733 1726773077.81070: results queue empty 9733 1726773077.81071: checking for any_errors_fatal 9733 1726773077.81075: done checking for any_errors_fatal 9733 1726773077.81076: checking for max_fail_percentage 9733 1726773077.81077: done checking for max_fail_percentage 9733 1726773077.81078: checking to see if all hosts have failed and the running result is not ok 9733 1726773077.81078: done checking to see if all hosts have failed 9733 1726773077.81079: getting the remaining hosts for this loop 9733 1726773077.81080: done getting the remaining hosts for this loop 9733 1726773077.81083: getting the next task for host managed_node3 9733 1726773077.81089: done getting next task for host managed_node3 9733 1726773077.81093: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 9733 1726773077.81095: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773077.81111: getting variables 9733 1726773077.81113: in VariableManager get_vars() 9733 1726773077.81139: Calling all_inventory to load vars for managed_node3 9733 1726773077.81142: Calling groups_inventory to load vars for managed_node3 9733 1726773077.81146: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773077.81156: Calling all_plugins_play to load vars for managed_node3 9733 1726773077.81159: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773077.81161: Calling groups_plugins_play to load vars for managed_node3 9733 1726773077.81319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773077.81518: done with get_vars() 9733 1726773077.81529: done getting variables 9733 1726773077.81593: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91 Thursday 19 September 2024 15:11:17 -0400 (0:00:00.051) 0:00:23.548 **** 9733 1726773077.81624: entering _queue_task() for managed_node3/copy 9733 1726773077.81829: worker is 1 (out of 1 available) 9733 1726773077.81845: exiting _queue_task() for managed_node3/copy 9733 1726773077.81859: done queuing things up, now waiting for results queue to drain 9733 1726773077.81862: waiting for pending results... 10580 1726773077.82000: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 10580 1726773077.82120: in run() - task 0affffe7-6841-7dd6-8fa6-00000000040c 10580 1726773077.82137: variable 'ansible_search_path' from source: unknown 10580 1726773077.82141: variable 'ansible_search_path' from source: unknown 10580 1726773077.82168: calling self._execute() 10580 1726773077.82238: variable 'ansible_host' from source: host vars for 'managed_node3' 10580 1726773077.82247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10580 1726773077.82256: variable 'omit' from source: magic vars 10580 1726773077.82335: variable 'omit' from source: magic vars 10580 1726773077.82370: variable 'omit' from source: magic vars 10580 1726773077.82394: variable '__kernel_settings_active_profile' from source: set_fact 10580 1726773077.82617: variable '__kernel_settings_active_profile' from source: set_fact 10580 1726773077.82640: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 10580 1726773077.82695: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 10580 1726773077.82749: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10580 1726773077.82774: variable 'omit' from source: magic vars 10580 1726773077.82808: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10580 1726773077.82834: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10580 1726773077.82853: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10580 1726773077.82868: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10580 1726773077.82880: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10580 1726773077.82905: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10580 1726773077.82911: variable 'ansible_host' from source: host vars for 'managed_node3' 10580 1726773077.82915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10580 1726773077.82987: Set connection var ansible_timeout to 10 10580 1726773077.82992: Set connection var ansible_shell_type to sh 10580 1726773077.82998: Set connection var ansible_module_compression to ZIP_DEFLATED 10580 1726773077.83004: Set connection var ansible_shell_executable to /bin/sh 10580 1726773077.83009: Set connection var ansible_pipelining to False 10580 1726773077.83016: Set connection var ansible_connection to ssh 10580 1726773077.83031: variable 'ansible_shell_executable' from source: unknown 10580 1726773077.83034: variable 'ansible_connection' from source: unknown 10580 1726773077.83038: variable 'ansible_module_compression' from source: unknown 10580 1726773077.83041: variable 'ansible_shell_type' from source: unknown 10580 1726773077.83044: variable 'ansible_shell_executable' from source: unknown 10580 1726773077.83048: variable 'ansible_host' from source: host vars for 'managed_node3' 10580 1726773077.83052: variable 'ansible_pipelining' from source: unknown 10580 1726773077.83055: variable 'ansible_timeout' from source: unknown 10580 1726773077.83060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10580 1726773077.83153: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10580 1726773077.83162: variable 'omit' from source: magic vars 10580 1726773077.83167: starting attempt loop 10580 1726773077.83169: running the handler 10580 1726773077.83181: _low_level_execute_command(): starting 10580 1726773077.83191: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10580 1726773077.85733: stdout chunk (state=2): >>>/root <<< 10580 1726773077.85844: stderr chunk (state=3): >>><<< 10580 1726773077.85851: stdout chunk (state=3): >>><<< 10580 1726773077.85871: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10580 1726773077.85888: _low_level_execute_command(): starting 10580 1726773077.85895: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773077.8588097-10580-130330799064038 `" && echo ansible-tmp-1726773077.8588097-10580-130330799064038="` echo /root/.ansible/tmp/ansible-tmp-1726773077.8588097-10580-130330799064038 `" ) && sleep 0' 10580 1726773077.88659: stdout chunk (state=2): >>>ansible-tmp-1726773077.8588097-10580-130330799064038=/root/.ansible/tmp/ansible-tmp-1726773077.8588097-10580-130330799064038 <<< 10580 1726773077.88793: stderr chunk (state=3): >>><<< 10580 1726773077.88802: stdout chunk (state=3): >>><<< 10580 1726773077.88818: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726773077.8588097-10580-130330799064038=/root/.ansible/tmp/ansible-tmp-1726773077.8588097-10580-130330799064038 , stderr= 10580 1726773077.88896: variable 'ansible_module_compression' from source: unknown 10580 1726773077.88939: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10580 1726773077.88969: variable 'ansible_facts' from source: unknown 10580 1726773077.89039: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773077.8588097-10580-130330799064038/AnsiballZ_stat.py 10580 1726773077.89131: Sending initial data 10580 1726773077.89138: Sent initial data (152 bytes) 10580 1726773077.91705: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmprkkqhpwr /root/.ansible/tmp/ansible-tmp-1726773077.8588097-10580-130330799064038/AnsiballZ_stat.py <<< 10580 1726773077.92877: stderr chunk (state=3): >>><<< 10580 1726773077.92887: stdout chunk (state=3): >>><<< 10580 1726773077.92905: done transferring module to remote 10580 1726773077.92916: _low_level_execute_command(): starting 10580 1726773077.92921: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773077.8588097-10580-130330799064038/ /root/.ansible/tmp/ansible-tmp-1726773077.8588097-10580-130330799064038/AnsiballZ_stat.py && sleep 0' 10580 1726773077.95347: stderr chunk (state=2): >>><<< 10580 1726773077.95358: stdout chunk (state=2): >>><<< 10580 1726773077.95375: _low_level_execute_command() done: rc=0, stdout=, stderr= 10580 1726773077.95380: _low_level_execute_command(): starting 10580 1726773077.95386: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773077.8588097-10580-130330799064038/AnsiballZ_stat.py && sleep 0' 10580 1726773078.11692: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 142606531, "dev": 51713, "nlink": 1, "atime": 1726773071.2061307, "mtime": 1726773065.4191086, "ctime": 1726773065.4191086, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "2407425296", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 10580 1726773078.12750: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. 
<<< 10580 1726773078.12798: stderr chunk (state=3): >>><<< 10580 1726773078.12805: stdout chunk (state=3): >>><<< 10580 1726773078.12820: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 142606531, "dev": 51713, "nlink": 1, "atime": 1726773071.2061307, "mtime": 1726773065.4191086, "ctime": 1726773065.4191086, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "2407425296", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.47.99 closed. 10580 1726773078.12859: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773077.8588097-10580-130330799064038/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10580 1726773078.12899: variable 'ansible_module_compression' from source: unknown 10580 1726773078.12931: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 10580 1726773078.12951: variable 'ansible_facts' from source: unknown 10580 1726773078.13011: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773077.8588097-10580-130330799064038/AnsiballZ_file.py 10580 1726773078.13097: Sending initial data 10580 1726773078.13104: Sent initial data (152 bytes) 10580 1726773078.15701: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmp0y34o6pu /root/.ansible/tmp/ansible-tmp-1726773077.8588097-10580-130330799064038/AnsiballZ_file.py <<< 10580 1726773078.16922: stderr chunk (state=3): >>><<< 10580 1726773078.16933: stdout chunk (state=3): >>><<< 10580 1726773078.16953: done transferring module to remote 10580 1726773078.16962: _low_level_execute_command(): starting 10580 1726773078.16967: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773077.8588097-10580-130330799064038/ /root/.ansible/tmp/ansible-tmp-1726773077.8588097-10580-130330799064038/AnsiballZ_file.py && sleep 0' 10580 1726773078.19394: stderr chunk (state=2): >>><<< 10580 1726773078.19404: stdout chunk (state=2): >>><<< 10580 1726773078.19419: _low_level_execute_command() done: rc=0, stdout=, stderr= 10580 1726773078.19424: 
_low_level_execute_command(): starting 10580 1726773078.19429: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773077.8588097-10580-130330799064038/AnsiballZ_file.py && sleep 0' 10580 1726773078.35483: stdout chunk (state=2): >>> {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmp90yohriu", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10580 1726773078.36602: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10580 1726773078.36644: stderr chunk (state=3): >>><<< 10580 1726773078.36651: stdout chunk (state=3): >>><<< 10580 1726773078.36668: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmp90yohriu", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.47.99 closed. 
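The file module run above leaves /etc/tuned/active_profile untouched because the earlier ansible.legacy.stat call already reported a checksum matching the content the role wants in place, so the copy action only enforces ownership and mode. A minimal sketch of that comparison (illustrative only, not the action plugin's code; the desired-content string is an assumption based on the slurp and set_fact results above):

    import hashlib

    # Assumed desired content: the active_profile fact plus a trailing newline
    desired = "virtual-guest kernel_settings\n"
    # Checksum reported by the remote ansible.legacy.stat call above
    remote_checksum = "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd"

    # When the checksums match, no file transfer is needed and the task
    # reports "changed": false, as seen in the result below.
    print(hashlib.sha1(desired.encode("utf-8")).hexdigest() == remote_checksum)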
10580 1726773078.36700: done with _execute_module (ansible.legacy.file, {'mode': '0600', 'dest': '/etc/tuned/active_profile', '_original_basename': 'tmp90yohriu', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773077.8588097-10580-130330799064038/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10580 1726773078.36711: _low_level_execute_command(): starting 10580 1726773078.36717: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773077.8588097-10580-130330799064038/ > /dev/null 2>&1 && sleep 0' 10580 1726773078.39377: stderr chunk (state=2): >>><<< 10580 1726773078.39387: stdout chunk (state=2): >>><<< 10580 1726773078.39403: _low_level_execute_command() done: rc=0, stdout=, stderr= 10580 1726773078.39412: handler run complete 10580 1726773078.39431: attempt loop complete, returning result 10580 1726773078.39435: _execute() done 10580 1726773078.39439: dumping result to json 10580 1726773078.39444: done dumping result, returning 10580 1726773078.39451: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile [0affffe7-6841-7dd6-8fa6-00000000040c] 10580 1726773078.39457: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000040c 10580 1726773078.39496: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000040c 10580 1726773078.39500: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/active_profile", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "state": "file", "uid": 0 } 9733 1726773078.39820: no more pending results, returning what we have 9733 1726773078.39823: results queue empty 9733 1726773078.39823: checking for any_errors_fatal 9733 1726773078.39828: done checking for any_errors_fatal 9733 1726773078.39829: checking for max_fail_percentage 9733 1726773078.39830: done checking for max_fail_percentage 9733 1726773078.39830: checking to see if all hosts have failed and the running result is not ok 9733 1726773078.39831: done checking to see if all hosts have failed 9733 1726773078.39831: getting the remaining hosts for this loop 9733 1726773078.39832: done getting the remaining hosts for this loop 9733 1726773078.39834: getting the next task for host managed_node3 9733 1726773078.39839: done getting next task for host managed_node3 9733 1726773078.39842: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 9733 1726773078.39843: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773078.39851: getting variables 9733 1726773078.39852: in VariableManager get_vars() 9733 1726773078.39878: Calling all_inventory to load vars for managed_node3 9733 1726773078.39880: Calling groups_inventory to load vars for managed_node3 9733 1726773078.39881: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773078.39891: Calling all_plugins_play to load vars for managed_node3 9733 1726773078.39893: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773078.39895: Calling groups_plugins_play to load vars for managed_node3 9733 1726773078.40003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773078.40299: done with get_vars() 9733 1726773078.40306: done getting variables 9733 1726773078.40348: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set profile_mode to manual] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99 Thursday 19 September 2024 15:11:18 -0400 (0:00:00.587) 0:00:24.136 **** 9733 1726773078.40368: entering _queue_task() for managed_node3/copy 9733 1726773078.40533: worker is 1 (out of 1 available) 9733 1726773078.40547: exiting _queue_task() for managed_node3/copy 9733 1726773078.40560: done queuing things up, now waiting for results queue to drain 9733 1726773078.40561: waiting for pending results... 
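Every task in this run stages its module payload in a uniquely named remote directory such as /root/.ansible/tmp/ansible-tmp-1726773077.8588097-10580-130330799064038. A small sketch of how such a name can be assembled (the exact recipe is an assumption; the middle field matches the worker PID that prefixes the corresponding log records):

    import os
    import random
    import time

    # Assumed recipe: wall-clock timestamp, process id, random suffix
    name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))
    print("/root/.ansible/tmp/" + name)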
10607 1726773078.40695: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 10607 1726773078.40820: in run() - task 0affffe7-6841-7dd6-8fa6-00000000040d 10607 1726773078.40836: variable 'ansible_search_path' from source: unknown 10607 1726773078.40840: variable 'ansible_search_path' from source: unknown 10607 1726773078.40869: calling self._execute() 10607 1726773078.40941: variable 'ansible_host' from source: host vars for 'managed_node3' 10607 1726773078.40950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10607 1726773078.40959: variable 'omit' from source: magic vars 10607 1726773078.41036: variable 'omit' from source: magic vars 10607 1726773078.41075: variable 'omit' from source: magic vars 10607 1726773078.41099: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 10607 1726773078.41323: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 10607 1726773078.41401: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10607 1726773078.41432: variable 'omit' from source: magic vars 10607 1726773078.41478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10607 1726773078.41507: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10607 1726773078.41533: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10607 1726773078.41548: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10607 1726773078.41558: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10607 1726773078.41623: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10607 1726773078.41629: variable 'ansible_host' from source: host vars for 'managed_node3' 10607 1726773078.41632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10607 1726773078.41729: Set connection var ansible_timeout to 10 10607 1726773078.41735: Set connection var ansible_shell_type to sh 10607 1726773078.41740: Set connection var ansible_module_compression to ZIP_DEFLATED 10607 1726773078.41745: Set connection var ansible_shell_executable to /bin/sh 10607 1726773078.41750: Set connection var ansible_pipelining to False 10607 1726773078.41757: Set connection var ansible_connection to ssh 10607 1726773078.41780: variable 'ansible_shell_executable' from source: unknown 10607 1726773078.41784: variable 'ansible_connection' from source: unknown 10607 1726773078.41790: variable 'ansible_module_compression' from source: unknown 10607 1726773078.41792: variable 'ansible_shell_type' from source: unknown 10607 1726773078.41795: variable 'ansible_shell_executable' from source: unknown 10607 1726773078.41797: variable 'ansible_host' from source: host vars for 'managed_node3' 10607 1726773078.41800: variable 'ansible_pipelining' from source: unknown 10607 1726773078.41803: variable 'ansible_timeout' from source: unknown 10607 1726773078.41806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10607 1726773078.41929: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10607 1726773078.41943: variable 'omit' from source: magic vars 10607 1726773078.41951: starting attempt loop 10607 1726773078.41955: running the handler 10607 1726773078.41967: _low_level_execute_command(): starting 10607 1726773078.41975: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10607 1726773078.44506: stdout chunk (state=2): >>>/root <<< 10607 1726773078.44623: stderr chunk (state=3): >>><<< 10607 1726773078.44630: stdout chunk (state=3): >>><<< 10607 1726773078.44648: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10607 1726773078.44660: _low_level_execute_command(): starting 10607 1726773078.44667: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773078.4465559-10607-12733640179040 `" && echo ansible-tmp-1726773078.4465559-10607-12733640179040="` echo /root/.ansible/tmp/ansible-tmp-1726773078.4465559-10607-12733640179040 `" ) && sleep 0' 10607 1726773078.47140: stdout chunk (state=2): >>>ansible-tmp-1726773078.4465559-10607-12733640179040=/root/.ansible/tmp/ansible-tmp-1726773078.4465559-10607-12733640179040 <<< 10607 1726773078.47264: stderr chunk (state=3): >>><<< 10607 1726773078.47272: stdout chunk (state=3): >>><<< 10607 1726773078.47290: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773078.4465559-10607-12733640179040=/root/.ansible/tmp/ansible-tmp-1726773078.4465559-10607-12733640179040 , stderr= 10607 1726773078.47360: variable 'ansible_module_compression' from source: unknown 10607 1726773078.47408: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10607 1726773078.47440: variable 'ansible_facts' from source: unknown 10607 1726773078.47510: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773078.4465559-10607-12733640179040/AnsiballZ_stat.py 10607 1726773078.47602: Sending initial data 10607 1726773078.47610: Sent initial data (151 bytes) 10607 1726773078.50194: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmps3w5atf9 /root/.ansible/tmp/ansible-tmp-1726773078.4465559-10607-12733640179040/AnsiballZ_stat.py <<< 10607 1726773078.51379: stderr chunk (state=3): >>><<< 10607 1726773078.51388: stdout chunk (state=3): >>><<< 10607 1726773078.51406: done transferring module to remote 10607 1726773078.51417: _low_level_execute_command(): starting 10607 1726773078.51422: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773078.4465559-10607-12733640179040/ /root/.ansible/tmp/ansible-tmp-1726773078.4465559-10607-12733640179040/AnsiballZ_stat.py && sleep 0' 10607 1726773078.53828: stderr chunk (state=2): >>><<< 10607 1726773078.53836: stdout chunk (state=2): >>><<< 10607 1726773078.53848: _low_level_execute_command() done: rc=0, stdout=, stderr= 10607 1726773078.53852: _low_level_execute_command(): starting 10607 1726773078.53857: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773078.4465559-10607-12733640179040/AnsiballZ_stat.py && sleep 0' 10607 1726773078.69761: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": 
false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 148897989, "dev": 51713, "nlink": 1, "atime": 1726773071.5581322, "mtime": 1726773065.4201086, "ctime": 1726773065.4201086, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "4277482174", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 10607 1726773078.70918: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10607 1726773078.70964: stderr chunk (state=3): >>><<< 10607 1726773078.70971: stdout chunk (state=3): >>><<< 10607 1726773078.70991: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 148897989, "dev": 51713, "nlink": 1, "atime": 1726773071.5581322, "mtime": 1726773065.4201086, "ctime": 1726773065.4201086, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "4277482174", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.47.99 closed. 
10607 1726773078.71031: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773078.4465559-10607-12733640179040/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10607 1726773078.71069: variable 'ansible_module_compression' from source: unknown 10607 1726773078.71105: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 10607 1726773078.71126: variable 'ansible_facts' from source: unknown 10607 1726773078.71187: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773078.4465559-10607-12733640179040/AnsiballZ_file.py 10607 1726773078.71270: Sending initial data 10607 1726773078.71281: Sent initial data (151 bytes) 10607 1726773078.73990: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpzzkganbj /root/.ansible/tmp/ansible-tmp-1726773078.4465559-10607-12733640179040/AnsiballZ_file.py <<< 10607 1726773078.75670: stderr chunk (state=3): >>><<< 10607 1726773078.75680: stdout chunk (state=3): >>><<< 10607 1726773078.75709: done transferring module to remote 10607 1726773078.75720: _low_level_execute_command(): starting 10607 1726773078.75726: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773078.4465559-10607-12733640179040/ /root/.ansible/tmp/ansible-tmp-1726773078.4465559-10607-12733640179040/AnsiballZ_file.py && sleep 0' 10607 1726773078.78399: stderr chunk (state=2): >>><<< 10607 1726773078.78409: stdout chunk (state=2): >>><<< 10607 1726773078.78430: _low_level_execute_command() done: rc=0, stdout=, stderr= 10607 1726773078.78436: _low_level_execute_command(): starting 10607 1726773078.78441: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773078.4465559-10607-12733640179040/AnsiballZ_file.py && sleep 0' 10607 1726773078.94768: stdout chunk (state=2): >>> {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmpapvadx4i", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10607 1726773078.95929: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. 
<<< 10607 1726773078.95980: stderr chunk (state=3): >>><<< 10607 1726773078.95988: stdout chunk (state=3): >>><<< 10607 1726773078.96004: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmpapvadx4i", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.47.99 closed. 10607 1726773078.96031: done with _execute_module (ansible.legacy.file, {'mode': '0600', 'dest': '/etc/tuned/profile_mode', '_original_basename': 'tmpapvadx4i', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773078.4465559-10607-12733640179040/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10607 1726773078.96042: _low_level_execute_command(): starting 10607 1726773078.96048: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773078.4465559-10607-12733640179040/ > /dev/null 2>&1 && sleep 0' 10607 1726773078.98507: stderr chunk (state=2): >>><<< 10607 1726773078.98515: stdout chunk (state=2): >>><<< 10607 1726773078.98531: _low_level_execute_command() done: rc=0, stdout=, stderr= 10607 1726773078.98539: handler run complete 10607 1726773078.98559: attempt loop complete, returning result 10607 1726773078.98563: _execute() done 10607 1726773078.98566: dumping result to json 10607 1726773078.98571: done dumping result, returning 10607 1726773078.98582: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual [0affffe7-6841-7dd6-8fa6-00000000040d] 10607 1726773078.98589: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000040d 10607 1726773078.98624: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000040d 10607 1726773078.98627: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/profile_mode", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "state": "file", "uid": 0 } 9733 1726773078.98788: no more pending results, returning what we have 9733 1726773078.98791: results queue empty 9733 1726773078.98792: checking for any_errors_fatal 9733 1726773078.98800: done checking for any_errors_fatal 9733 1726773078.98800: 
checking for max_fail_percentage 9733 1726773078.98802: done checking for max_fail_percentage 9733 1726773078.98802: checking to see if all hosts have failed and the running result is not ok 9733 1726773078.98803: done checking to see if all hosts have failed 9733 1726773078.98803: getting the remaining hosts for this loop 9733 1726773078.98804: done getting the remaining hosts for this loop 9733 1726773078.98809: getting the next task for host managed_node3 9733 1726773078.98814: done getting next task for host managed_node3 9733 1726773078.98817: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 9733 1726773078.98819: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773078.98830: getting variables 9733 1726773078.98831: in VariableManager get_vars() 9733 1726773078.98863: Calling all_inventory to load vars for managed_node3 9733 1726773078.98865: Calling groups_inventory to load vars for managed_node3 9733 1726773078.98867: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773078.98877: Calling all_plugins_play to load vars for managed_node3 9733 1726773078.98879: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773078.98882: Calling groups_plugins_play to load vars for managed_node3 9733 1726773078.98994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773078.99116: done with get_vars() 9733 1726773078.99125: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get current config] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107 Thursday 19 September 2024 15:11:18 -0400 (0:00:00.588) 0:00:24.724 **** 9733 1726773078.99188: entering _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config 9733 1726773078.99350: worker is 1 (out of 1 available) 9733 1726773078.99365: exiting _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config 9733 1726773078.99376: done queuing things up, now waiting for results queue to drain 9733 1726773078.99378: waiting for pending results... 
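The Get current config task queued above runs the collection's kernel_settings_get_config module against /etc/tuned/kernel_settings/tuned.conf. A rough approximation of the output shape (illustrative only, not the module's implementation), assuming the profile is the usual INI-style tuned.conf:

    import configparser

    def get_config(path="/etc/tuned/kernel_settings/tuned.conf"):
        # Parse the INI-style profile and return each section as a plain dict,
        # mirroring the shape of the "data" field in the result printed below.
        parser = configparser.ConfigParser()
        parser.read(path)
        return {section: dict(parser.items(section)) for section in parser.sections()}

    # Expected shape: {"main": {"summary": "kernel settings"},
    #                  "sysctl": {...}, "sysfs": {...}, "vm": {...}}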
10629 1726773078.99514: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get current config 10629 1726773078.99636: in run() - task 0affffe7-6841-7dd6-8fa6-00000000040e 10629 1726773078.99652: variable 'ansible_search_path' from source: unknown 10629 1726773078.99656: variable 'ansible_search_path' from source: unknown 10629 1726773078.99687: calling self._execute() 10629 1726773078.99756: variable 'ansible_host' from source: host vars for 'managed_node3' 10629 1726773078.99765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10629 1726773078.99776: variable 'omit' from source: magic vars 10629 1726773078.99849: variable 'omit' from source: magic vars 10629 1726773078.99889: variable 'omit' from source: magic vars 10629 1726773078.99912: variable '__kernel_settings_profile_filename' from source: role '' all vars 10629 1726773079.00138: variable '__kernel_settings_profile_filename' from source: role '' all vars 10629 1726773079.00201: variable '__kernel_settings_profile_dir' from source: role '' all vars 10629 1726773079.00267: variable '__kernel_settings_profile_parent' from source: set_fact 10629 1726773079.00278: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10629 1726773079.00370: variable 'omit' from source: magic vars 10629 1726773079.00406: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10629 1726773079.00431: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10629 1726773079.00450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10629 1726773079.00463: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10629 1726773079.00478: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10629 1726773079.00504: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10629 1726773079.00509: variable 'ansible_host' from source: host vars for 'managed_node3' 10629 1726773079.00513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10629 1726773079.00582: Set connection var ansible_timeout to 10 10629 1726773079.00590: Set connection var ansible_shell_type to sh 10629 1726773079.00597: Set connection var ansible_module_compression to ZIP_DEFLATED 10629 1726773079.00602: Set connection var ansible_shell_executable to /bin/sh 10629 1726773079.00608: Set connection var ansible_pipelining to False 10629 1726773079.00614: Set connection var ansible_connection to ssh 10629 1726773079.00636: variable 'ansible_shell_executable' from source: unknown 10629 1726773079.00640: variable 'ansible_connection' from source: unknown 10629 1726773079.00643: variable 'ansible_module_compression' from source: unknown 10629 1726773079.00647: variable 'ansible_shell_type' from source: unknown 10629 1726773079.00650: variable 'ansible_shell_executable' from source: unknown 10629 1726773079.00654: variable 'ansible_host' from source: host vars for 'managed_node3' 10629 1726773079.00658: variable 'ansible_pipelining' from source: unknown 10629 1726773079.00661: variable 'ansible_timeout' from source: unknown 10629 1726773079.00665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10629 1726773079.00798: Loading ActionModule 'normal' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10629 1726773079.00809: variable 'omit' from source: magic vars 10629 1726773079.00816: starting attempt loop 10629 1726773079.00819: running the handler 10629 1726773079.00830: _low_level_execute_command(): starting 10629 1726773079.00838: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10629 1726773079.03233: stdout chunk (state=2): >>>/root <<< 10629 1726773079.03338: stderr chunk (state=3): >>><<< 10629 1726773079.03345: stdout chunk (state=3): >>><<< 10629 1726773079.03364: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10629 1726773079.03379: _low_level_execute_command(): starting 10629 1726773079.03387: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773079.0337162-10629-192101546021569 `" && echo ansible-tmp-1726773079.0337162-10629-192101546021569="` echo /root/.ansible/tmp/ansible-tmp-1726773079.0337162-10629-192101546021569 `" ) && sleep 0' 10629 1726773079.05926: stdout chunk (state=2): >>>ansible-tmp-1726773079.0337162-10629-192101546021569=/root/.ansible/tmp/ansible-tmp-1726773079.0337162-10629-192101546021569 <<< 10629 1726773079.06054: stderr chunk (state=3): >>><<< 10629 1726773079.06062: stdout chunk (state=3): >>><<< 10629 1726773079.06078: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773079.0337162-10629-192101546021569=/root/.ansible/tmp/ansible-tmp-1726773079.0337162-10629-192101546021569 , stderr= 10629 1726773079.06116: variable 'ansible_module_compression' from source: unknown 10629 1726773079.06148: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 10629 1726773079.06183: variable 'ansible_facts' from source: unknown 10629 1726773079.06248: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773079.0337162-10629-192101546021569/AnsiballZ_kernel_settings_get_config.py 10629 1726773079.06345: Sending initial data 10629 1726773079.06354: Sent initial data (174 bytes) 10629 1726773079.08989: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmppc6ashpk /root/.ansible/tmp/ansible-tmp-1726773079.0337162-10629-192101546021569/AnsiballZ_kernel_settings_get_config.py <<< 10629 1726773079.10160: stderr chunk (state=3): >>><<< 10629 1726773079.10174: stdout chunk (state=3): >>><<< 10629 1726773079.10196: done transferring module to remote 10629 1726773079.10206: _low_level_execute_command(): starting 10629 1726773079.10212: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773079.0337162-10629-192101546021569/ /root/.ansible/tmp/ansible-tmp-1726773079.0337162-10629-192101546021569/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10629 1726773079.13067: stderr chunk (state=2): >>><<< 10629 1726773079.13081: stdout chunk (state=2): >>><<< 10629 1726773079.13098: _low_level_execute_command() done: rc=0, stdout=, stderr= 10629 1726773079.13102: _low_level_execute_command(): starting 10629 1726773079.13107: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python 
/root/.ansible/tmp/ansible-tmp-1726773079.0337162-10629-192101546021569/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10629 1726773079.28894: stdout chunk (state=2): >>> {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "fs.file-max": "379724"}, "sysfs": {"/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0"}, "vm": {"transparent_hugepages": "madvise"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} <<< 10629 1726773079.29983: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10629 1726773079.30031: stderr chunk (state=3): >>><<< 10629 1726773079.30038: stdout chunk (state=3): >>><<< 10629 1726773079.30055: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "fs.file-max": "379724"}, "sysfs": {"/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0"}, "vm": {"transparent_hugepages": "madvise"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} , stderr=Shared connection to 10.31.47.99 closed. 10629 1726773079.30084: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773079.0337162-10629-192101546021569/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10629 1726773079.30097: _low_level_execute_command(): starting 10629 1726773079.30103: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773079.0337162-10629-192101546021569/ > /dev/null 2>&1 && sleep 0' 10629 1726773079.32607: stderr chunk (state=2): >>><<< 10629 1726773079.32616: stdout chunk (state=2): >>><<< 10629 1726773079.32630: _low_level_execute_command() done: rc=0, stdout=, stderr= 10629 1726773079.32637: handler run complete 10629 1726773079.32651: attempt loop complete, returning result 10629 1726773079.32655: _execute() done 10629 1726773079.32658: dumping result to json 10629 1726773079.32663: done dumping result, returning 10629 1726773079.32671: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get current config [0affffe7-6841-7dd6-8fa6-00000000040e] 10629 1726773079.32678: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000040e 10629 1726773079.32710: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000040e 10629 1726773079.32713: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "data": { "main": { "summary": "kernel settings" }, "sysctl": { "fs.epoll.max_user_watches": "785592", "fs.file-max": "379724" }, "sysfs": { "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0" }, "vm": { 
"transparent_hugepages": "madvise" } } } 9733 1726773079.32867: no more pending results, returning what we have 9733 1726773079.32870: results queue empty 9733 1726773079.32871: checking for any_errors_fatal 9733 1726773079.32879: done checking for any_errors_fatal 9733 1726773079.32880: checking for max_fail_percentage 9733 1726773079.32881: done checking for max_fail_percentage 9733 1726773079.32882: checking to see if all hosts have failed and the running result is not ok 9733 1726773079.32882: done checking to see if all hosts have failed 9733 1726773079.32883: getting the remaining hosts for this loop 9733 1726773079.32884: done getting the remaining hosts for this loop 9733 1726773079.32889: getting the next task for host managed_node3 9733 1726773079.32894: done getting next task for host managed_node3 9733 1726773079.32898: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 9733 1726773079.32900: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773079.32911: getting variables 9733 1726773079.32912: in VariableManager get_vars() 9733 1726773079.32946: Calling all_inventory to load vars for managed_node3 9733 1726773079.32948: Calling groups_inventory to load vars for managed_node3 9733 1726773079.32950: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773079.32960: Calling all_plugins_play to load vars for managed_node3 9733 1726773079.32962: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773079.32964: Calling groups_plugins_play to load vars for managed_node3 9733 1726773079.33124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773079.33243: done with get_vars() 9733 1726773079.33251: done getting variables 9733 1726773079.33300: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Apply kernel settings] ******* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 Thursday 19 September 2024 15:11:19 -0400 (0:00:00.341) 0:00:25.065 **** 9733 1726773079.33322: entering _queue_task() for managed_node3/template 9733 1726773079.33490: worker is 1 (out of 1 available) 9733 1726773079.33507: exiting _queue_task() for managed_node3/template 9733 1726773079.33519: done queuing things up, now waiting for results queue to drain 9733 1726773079.33521: waiting for pending results... 
10648 1726773079.33646: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 10648 1726773079.33765: in run() - task 0affffe7-6841-7dd6-8fa6-00000000040f 10648 1726773079.33782: variable 'ansible_search_path' from source: unknown 10648 1726773079.33788: variable 'ansible_search_path' from source: unknown 10648 1726773079.33818: calling self._execute() 10648 1726773079.33889: variable 'ansible_host' from source: host vars for 'managed_node3' 10648 1726773079.33899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10648 1726773079.33908: variable 'omit' from source: magic vars 10648 1726773079.33981: variable 'omit' from source: magic vars 10648 1726773079.34019: variable 'omit' from source: magic vars 10648 1726773079.34255: variable '__kernel_settings_profile_src' from source: role '' all vars 10648 1726773079.34267: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10648 1726773079.34329: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10648 1726773079.34351: variable '__kernel_settings_profile_filename' from source: role '' all vars 10648 1726773079.34400: variable '__kernel_settings_profile_filename' from source: role '' all vars 10648 1726773079.34448: variable '__kernel_settings_profile_dir' from source: role '' all vars 10648 1726773079.34539: variable '__kernel_settings_profile_parent' from source: set_fact 10648 1726773079.34548: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10648 1726773079.34576: variable 'omit' from source: magic vars 10648 1726773079.34616: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10648 1726773079.34648: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10648 1726773079.34669: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10648 1726773079.34687: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10648 1726773079.34700: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10648 1726773079.34727: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10648 1726773079.34732: variable 'ansible_host' from source: host vars for 'managed_node3' 10648 1726773079.34736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10648 1726773079.34826: Set connection var ansible_timeout to 10 10648 1726773079.34831: Set connection var ansible_shell_type to sh 10648 1726773079.34836: Set connection var ansible_module_compression to ZIP_DEFLATED 10648 1726773079.34840: Set connection var ansible_shell_executable to /bin/sh 10648 1726773079.34845: Set connection var ansible_pipelining to False 10648 1726773079.34850: Set connection var ansible_connection to ssh 10648 1726773079.34868: variable 'ansible_shell_executable' from source: unknown 10648 1726773079.34872: variable 'ansible_connection' from source: unknown 10648 1726773079.34875: variable 'ansible_module_compression' from source: unknown 10648 1726773079.34877: variable 'ansible_shell_type' from source: unknown 10648 1726773079.34880: variable 'ansible_shell_executable' from source: unknown 10648 1726773079.34882: variable 'ansible_host' from source: host vars for 'managed_node3' 10648 1726773079.34887: 
variable 'ansible_pipelining' from source: unknown 10648 1726773079.34890: variable 'ansible_timeout' from source: unknown 10648 1726773079.34893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10648 1726773079.35006: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10648 1726773079.35019: variable 'omit' from source: magic vars 10648 1726773079.35025: starting attempt loop 10648 1726773079.35029: running the handler 10648 1726773079.35041: _low_level_execute_command(): starting 10648 1726773079.35049: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10648 1726773079.37706: stdout chunk (state=2): >>>/root <<< 10648 1726773079.37810: stderr chunk (state=3): >>><<< 10648 1726773079.37818: stdout chunk (state=3): >>><<< 10648 1726773079.37839: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10648 1726773079.37852: _low_level_execute_command(): starting 10648 1726773079.37859: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773079.3784676-10648-138722795723966 `" && echo ansible-tmp-1726773079.3784676-10648-138722795723966="` echo /root/.ansible/tmp/ansible-tmp-1726773079.3784676-10648-138722795723966 `" ) && sleep 0' 10648 1726773079.40591: stdout chunk (state=2): >>>ansible-tmp-1726773079.3784676-10648-138722795723966=/root/.ansible/tmp/ansible-tmp-1726773079.3784676-10648-138722795723966 <<< 10648 1726773079.40890: stderr chunk (state=3): >>><<< 10648 1726773079.40898: stdout chunk (state=3): >>><<< 10648 1726773079.40915: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773079.3784676-10648-138722795723966=/root/.ansible/tmp/ansible-tmp-1726773079.3784676-10648-138722795723966 , stderr= 10648 1726773079.40933: evaluation_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks 10648 1726773079.40955: search_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2 10648 1726773079.40978: variable 'ansible_search_path' from source: unknown 10648 1726773079.41834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10648 1726773079.43761: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10648 1726773079.43812: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10648 1726773079.43842: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10648 1726773079.43869: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10648 1726773079.43902: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10648 1726773079.44093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10648 1726773079.44115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10648 1726773079.44136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10648 1726773079.44166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10648 1726773079.44180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10648 1726773079.44400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10648 1726773079.44419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10648 1726773079.44436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10648 1726773079.44461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10648 1726773079.44472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10648 1726773079.44724: variable 'ansible_managed' from source: unknown 10648 1726773079.44732: variable '__sections' from source: task vars 10648 1726773079.44823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10648 1726773079.44842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10648 1726773079.44858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10648 1726773079.44884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10648 1726773079.44902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10648 1726773079.44986: variable 'kernel_settings_sysctl' from source: include_vars 10648 1726773079.45000: variable '__kernel_settings_state_empty' from source: role '' all vars 10648 1726773079.45006: variable '__kernel_settings_previous_replaced' from source: role '' all vars 10648 1726773079.45060: variable '__sysctl_old' from source: task vars 10648 1726773079.45112: variable '__sysctl_old' from source: task vars 10648 1726773079.45303: variable 'kernel_settings_purge' from source: role '' defaults 10648 1726773079.45311: variable 'kernel_settings_sysctl' from source: include_vars 10648 1726773079.45320: variable '__kernel_settings_state_empty' from source: role '' all vars 10648 1726773079.45324: variable '__kernel_settings_previous_replaced' from source: role '' all vars 10648 1726773079.45329: variable '__kernel_settings_profile_contents' from source: set_fact 10648 1726773079.45534: variable 'kernel_settings_sysfs' from source: include_vars 10648 1726773079.45547: variable '__kernel_settings_state_empty' from source: role '' all vars 10648 1726773079.45552: variable '__kernel_settings_previous_replaced' from source: role '' all vars 10648 1726773079.45607: variable '__sysfs_old' from source: task vars 10648 1726773079.45664: variable '__sysfs_old' from source: task vars 10648 1726773079.45867: variable 'kernel_settings_purge' from source: role '' defaults 10648 1726773079.45873: variable 'kernel_settings_sysfs' from source: include_vars 10648 1726773079.45880: variable '__kernel_settings_state_empty' from source: role '' all vars 10648 1726773079.45883: variable '__kernel_settings_previous_replaced' from source: role '' all vars 10648 1726773079.45889: variable '__kernel_settings_profile_contents' from source: set_fact 10648 1726773079.45931: variable 'kernel_settings_systemd_cpu_affinity' from source: include_vars 10648 1726773079.45974: variable 'kernel_settings_systemd_cpu_affinity' from source: include_vars 10648 1726773079.45989: variable '__systemd_old' from source: task vars 10648 1726773079.46028: variable '__systemd_old' from source: task vars 10648 1726773079.46167: variable 'kernel_settings_purge' from source: role '' defaults 10648 1726773079.46174: variable 'kernel_settings_systemd_cpu_affinity' from source: include_vars 10648 1726773079.46180: variable '__kernel_settings_state_absent' from source: role '' all vars 10648 1726773079.46187: variable '__kernel_settings_profile_contents' from source: set_fact 10648 1726773079.46199: variable 'kernel_settings_transparent_hugepages' from source: include_vars 10648 1726773079.46239: variable 'kernel_settings_transparent_hugepages' from source: include_vars 10648 1726773079.46248: variable 'kernel_settings_transparent_hugepages_defrag' from source: include_vars 10648 1726773079.46293: variable 'kernel_settings_transparent_hugepages_defrag' from source: include_vars 10648 1726773079.46307: variable '__trans_huge_old' from source: task vars 10648 1726773079.46348: variable '__trans_huge_old' from source: task vars 10648 1726773079.46476: variable 'kernel_settings_purge' from source: role '' defaults 10648 1726773079.46483: variable 'kernel_settings_transparent_hugepages' from source: include_vars 
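The variable resolution above gathers everything that feeds the role's kernel_settings.j2 template (kernel_settings_sysctl, kernel_settings_sysfs, the transparent-hugepages settings, and the purge/state flags) before it renders /etc/tuned/kernel_settings/tuned.conf. The template itself is not reproduced in the log; a minimal sketch of the underlying idea, assuming the goal is simply to turn per-section dictionaries into tuned.conf-style INI text (and ignoring the purge/absent handling the real template also performs), could look like:

    def render_tuned_conf(sections, summary="kernel settings"):
        # sections is e.g. {"sysctl": {...}, "sysfs": {...}, "vm": {...}},
        # the same shape the get_config module returned earlier.
        lines = ["[main]", f"summary = {summary}", ""]
        for name, values in sections.items():
            lines.append(f"[{name}]")
            for key, value in values.items():
                lines.append(f"{key} = {value}")
            lines.append("")
        return "\n".join(lines)

    example = {
        "sysctl": {"fs.epoll.max_user_watches": "785592", "fs.file-max": "379724"},
        "vm": {"transparent_hugepages": "madvise"},
    }
    print(render_tuned_conf(example))
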
10648 1726773079.46490: variable '__kernel_settings_state_absent' from source: role '' all vars 10648 1726773079.46495: variable '__kernel_settings_profile_contents' from source: set_fact 10648 1726773079.46506: variable '__trans_defrag_old' from source: task vars 10648 1726773079.46546: variable '__trans_defrag_old' from source: task vars 10648 1726773079.46674: variable 'kernel_settings_purge' from source: role '' defaults 10648 1726773079.46681: variable 'kernel_settings_transparent_hugepages_defrag' from source: include_vars 10648 1726773079.46689: variable '__kernel_settings_state_absent' from source: role '' all vars 10648 1726773079.46694: variable '__kernel_settings_profile_contents' from source: set_fact 10648 1726773079.46708: variable '__kernel_settings_state_absent' from source: role '' all vars 10648 1726773079.46718: variable '__kernel_settings_state_absent' from source: role '' all vars 10648 1726773079.46723: variable '__kernel_settings_state_absent' from source: role '' all vars 10648 1726773079.46729: variable '__kernel_settings_state_absent' from source: role '' all vars 10648 1726773079.46734: variable '__kernel_settings_state_absent' from source: role '' all vars 10648 1726773079.46737: variable '__kernel_settings_state_absent' from source: role '' all vars 10648 1726773079.46741: variable '__kernel_settings_state_absent' from source: role '' all vars 10648 1726773079.46748: variable '__kernel_settings_state_absent' from source: role '' all vars 10648 1726773079.46753: variable '__kernel_settings_state_absent' from source: role '' all vars 10648 1726773079.46757: variable '__kernel_settings_state_absent' from source: role '' all vars 10648 1726773079.46767: variable '__kernel_settings_state_absent' from source: role '' all vars 10648 1726773079.46773: variable '__kernel_settings_state_absent' from source: role '' all vars 10648 1726773079.46778: variable '__kernel_settings_state_absent' from source: role '' all vars 10648 1726773079.46782: variable '__kernel_settings_state_absent' from source: role '' all vars 10648 1726773079.46789: variable '__kernel_settings_state_absent' from source: role '' all vars 10648 1726773079.46796: variable '__kernel_settings_state_absent' from source: role '' all vars 10648 1726773079.46802: variable '__kernel_settings_state_absent' from source: role '' all vars 10648 1726773079.46809: variable '__kernel_settings_state_absent' from source: role '' all vars 10648 1726773079.46815: variable '__kernel_settings_state_absent' from source: role '' all vars 10648 1726773079.46820: variable '__kernel_settings_state_absent' from source: role '' all vars 10648 1726773079.47255: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10648 1726773079.47301: variable 'ansible_module_compression' from source: unknown 10648 1726773079.47339: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10648 1726773079.47360: variable 'ansible_facts' from source: unknown 10648 1726773079.47422: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773079.3784676-10648-138722795723966/AnsiballZ_stat.py 10648 1726773079.47510: Sending initial data 10648 1726773079.47516: Sent initial data (152 bytes) 10648 1726773079.50209: stdout chunk 
(state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpz_bu_6pb /root/.ansible/tmp/ansible-tmp-1726773079.3784676-10648-138722795723966/AnsiballZ_stat.py <<< 10648 1726773079.51445: stderr chunk (state=3): >>><<< 10648 1726773079.51454: stdout chunk (state=3): >>><<< 10648 1726773079.51473: done transferring module to remote 10648 1726773079.51486: _low_level_execute_command(): starting 10648 1726773079.51493: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773079.3784676-10648-138722795723966/ /root/.ansible/tmp/ansible-tmp-1726773079.3784676-10648-138722795723966/AnsiballZ_stat.py && sleep 0' 10648 1726773079.54137: stderr chunk (state=2): >>><<< 10648 1726773079.54148: stdout chunk (state=2): >>><<< 10648 1726773079.54162: _low_level_execute_command() done: rc=0, stdout=, stderr= 10648 1726773079.54167: _low_level_execute_command(): starting 10648 1726773079.54172: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773079.3784676-10648-138722795723966/AnsiballZ_stat.py && sleep 0' 10648 1726773079.70521: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 312, "inode": 159383749, "dev": 51713, "nlink": 1, "atime": 1726773065.3961084, "mtime": 1726773064.4551048, "ctime": 1726773064.7141058, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "ba15904bb90578344fad097ce2f46f9231275eae", "mimetype": "text/plain", "charset": "us-ascii", "version": "1834219966", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 10648 1726773079.71573: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. 
<<< 10648 1726773079.71623: stderr chunk (state=3): >>><<< 10648 1726773079.71636: stdout chunk (state=3): >>><<< 10648 1726773079.71648: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 312, "inode": 159383749, "dev": 51713, "nlink": 1, "atime": 1726773065.3961084, "mtime": 1726773064.4551048, "ctime": 1726773064.7141058, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "ba15904bb90578344fad097ce2f46f9231275eae", "mimetype": "text/plain", "charset": "us-ascii", "version": "1834219966", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.47.99 closed. 10648 1726773079.71688: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773079.3784676-10648-138722795723966/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10648 1726773079.71721: variable 'ansible_module_compression' from source: unknown 10648 1726773079.71753: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 10648 1726773079.71771: variable 'ansible_facts' from source: unknown 10648 1726773079.71825: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773079.3784676-10648-138722795723966/AnsiballZ_file.py 10648 1726773079.71915: Sending initial data 10648 1726773079.71922: Sent initial data (152 bytes) 10648 1726773079.74517: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmp7vygryg0 /root/.ansible/tmp/ansible-tmp-1726773079.3784676-10648-138722795723966/AnsiballZ_file.py <<< 10648 1726773079.75730: stderr chunk (state=3): >>><<< 10648 1726773079.75738: stdout chunk (state=3): >>><<< 10648 1726773079.75756: done transferring module to remote 10648 1726773079.75764: _low_level_execute_command(): starting 10648 1726773079.75769: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773079.3784676-10648-138722795723966/ /root/.ansible/tmp/ansible-tmp-1726773079.3784676-10648-138722795723966/AnsiballZ_file.py && sleep 0' 10648 1726773079.78166: stderr chunk (state=2): >>><<< 10648 1726773079.78174: stdout chunk (state=2): >>><<< 10648 1726773079.78190: _low_level_execute_command() done: rc=0, stdout=, stderr= 10648 
1726773079.78194: _low_level_execute_command(): starting 10648 1726773079.78197: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773079.3784676-10648-138722795723966/AnsiballZ_file.py && sleep 0' 10648 1726773079.94367: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings/tuned.conf", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings/tuned.conf"}, "after": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 312, "invocation": {"module_args": {"mode": "0644", "dest": "/etc/tuned/kernel_settings/tuned.conf", "_original_basename": "kernel_settings.j2", "recurse": false, "state": "file", "path": "/etc/tuned/kernel_settings/tuned.conf", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10648 1726773079.95520: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10648 1726773079.95572: stderr chunk (state=3): >>><<< 10648 1726773079.95579: stdout chunk (state=3): >>><<< 10648 1726773079.95597: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings/tuned.conf", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings/tuned.conf"}, "after": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 312, "invocation": {"module_args": {"mode": "0644", "dest": "/etc/tuned/kernel_settings/tuned.conf", "_original_basename": "kernel_settings.j2", "recurse": false, "state": "file", "path": "/etc/tuned/kernel_settings/tuned.conf", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.47.99 closed. 
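Before writing anything, the template action above first stat'ed the existing /etc/tuned/kernel_settings/tuned.conf (note the sha1 "checksum" in the stat result) and, because the rendered content matched, only ran the file module to confirm owner, group, and mode, so the task ends up reporting "changed": false. The checksum being compared is a plain SHA-1 of the file contents, for example:

    import hashlib

    def sha1_of_file(path):
        # Same kind of content checksum as the "checksum" field in the stat
        # result above (ba15904bb90578344fad097ce2f46f9231275eae).
        digest = hashlib.sha1()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # If the locally rendered template and the remote file hash to the same
    # value, there is nothing to transfer and the task can stay changed=false.
    # local = sha1_of_file("rendered_kernel_settings.conf")  # hypothetical local render
    # remote = "ba15904bb90578344fad097ce2f46f9231275eae"    # from the stat result above
    # needs_copy = local != remote
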
10648 1726773079.95624: done with _execute_module (ansible.legacy.file, {'mode': '0644', 'dest': '/etc/tuned/kernel_settings/tuned.conf', '_original_basename': 'kernel_settings.j2', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773079.3784676-10648-138722795723966/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10648 1726773079.95654: _low_level_execute_command(): starting 10648 1726773079.95662: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773079.3784676-10648-138722795723966/ > /dev/null 2>&1 && sleep 0' 10648 1726773079.98157: stderr chunk (state=2): >>><<< 10648 1726773079.98166: stdout chunk (state=2): >>><<< 10648 1726773079.98183: _low_level_execute_command() done: rc=0, stdout=, stderr= 10648 1726773079.98196: handler run complete 10648 1726773079.98216: attempt loop complete, returning result 10648 1726773079.98219: _execute() done 10648 1726773079.98223: dumping result to json 10648 1726773079.98229: done dumping result, returning 10648 1726773079.98236: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings [0affffe7-6841-7dd6-8fa6-00000000040f] 10648 1726773079.98242: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000040f 10648 1726773079.98286: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000040f 10648 1726773079.98290: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "checksum": "ba15904bb90578344fad097ce2f46f9231275eae", "dest": "/etc/tuned/kernel_settings/tuned.conf", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "path": "/etc/tuned/kernel_settings/tuned.conf", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 312, "state": "file", "uid": 0 } 9733 1726773079.98474: no more pending results, returning what we have 9733 1726773079.98477: results queue empty 9733 1726773079.98478: checking for any_errors_fatal 9733 1726773079.98490: done checking for any_errors_fatal 9733 1726773079.98491: checking for max_fail_percentage 9733 1726773079.98492: done checking for max_fail_percentage 9733 1726773079.98493: checking to see if all hosts have failed and the running result is not ok 9733 1726773079.98493: done checking to see if all hosts have failed 9733 1726773079.98494: getting the remaining hosts for this loop 9733 1726773079.98495: done getting the remaining hosts for this loop 9733 1726773079.98498: getting the next task for host managed_node3 9733 1726773079.98503: done getting next task for host managed_node3 9733 1726773079.98507: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 9733 1726773079.98509: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773079.98519: getting variables 9733 1726773079.98520: in VariableManager get_vars() 9733 1726773079.98553: Calling all_inventory to load vars for managed_node3 9733 1726773079.98556: Calling groups_inventory to load vars for managed_node3 9733 1726773079.98557: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773079.98567: Calling all_plugins_play to load vars for managed_node3 9733 1726773079.98569: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773079.98571: Calling groups_plugins_play to load vars for managed_node3 9733 1726773079.98687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773079.98809: done with get_vars() 9733 1726773079.98818: done getting variables 9733 1726773079.98859: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149 Thursday 19 September 2024 15:11:19 -0400 (0:00:00.655) 0:00:25.721 **** 9733 1726773079.98882: entering _queue_task() for managed_node3/service 9733 1726773079.99054: worker is 1 (out of 1 available) 9733 1726773079.99072: exiting _queue_task() for managed_node3/service 9733 1726773079.99089: done queuing things up, now waiting for results queue to drain 9733 1726773079.99091: waiting for pending results... 
10668 1726773079.99213: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 10668 1726773079.99327: in run() - task 0affffe7-6841-7dd6-8fa6-000000000410 10668 1726773079.99344: variable 'ansible_search_path' from source: unknown 10668 1726773079.99347: variable 'ansible_search_path' from source: unknown 10668 1726773079.99382: variable '__kernel_settings_services' from source: include_vars 10668 1726773079.99626: variable '__kernel_settings_services' from source: include_vars 10668 1726773079.99757: variable 'omit' from source: magic vars 10668 1726773079.99826: variable 'ansible_host' from source: host vars for 'managed_node3' 10668 1726773079.99837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10668 1726773079.99846: variable 'omit' from source: magic vars 10668 1726773080.00021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10668 1726773080.00193: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10668 1726773080.00226: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10668 1726773080.00251: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10668 1726773080.00277: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10668 1726773080.00351: variable '__kernel_settings_register_profile' from source: set_fact 10668 1726773080.00363: variable '__kernel_settings_register_mode' from source: set_fact 10668 1726773080.00380: Evaluated conditional (__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed): False 10668 1726773080.00387: when evaluation is False, skipping this task 10668 1726773080.00407: variable 'item' from source: unknown 10668 1726773080.00451: variable 'item' from source: unknown skipping: [managed_node3] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed", "item": "tuned", "skip_reason": "Conditional result was False" } 10668 1726773080.00474: dumping result to json 10668 1726773080.00478: done dumping result, returning 10668 1726773080.00482: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes [0affffe7-6841-7dd6-8fa6-000000000410] 10668 1726773080.00489: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000410 10668 1726773080.00505: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000410 10668 1726773080.00507: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false } MSG: All items skipped 9733 1726773080.00867: no more pending results, returning what we have 9733 1726773080.00869: results queue empty 9733 1726773080.00869: checking for any_errors_fatal 9733 1726773080.00878: done checking for any_errors_fatal 9733 1726773080.00878: checking for max_fail_percentage 9733 1726773080.00881: done checking for max_fail_percentage 9733 1726773080.00881: checking to see if all hosts have failed and the running result is not ok 9733 1726773080.00882: done checking to see if all hosts have failed 9733 1726773080.00882: getting the remaining hosts for this loop 9733 1726773080.00883: done getting the remaining 
hosts for this loop 9733 1726773080.00887: getting the next task for host managed_node3 9733 1726773080.00891: done getting next task for host managed_node3 9733 1726773080.00894: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 9733 1726773080.00895: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773080.00905: getting variables 9733 1726773080.00906: in VariableManager get_vars() 9733 1726773080.00931: Calling all_inventory to load vars for managed_node3 9733 1726773080.00933: Calling groups_inventory to load vars for managed_node3 9733 1726773080.00934: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773080.00941: Calling all_plugins_play to load vars for managed_node3 9733 1726773080.00943: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773080.00944: Calling groups_plugins_play to load vars for managed_node3 9733 1726773080.01048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773080.01169: done with get_vars() 9733 1726773080.01177: done getting variables 9733 1726773080.01220: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Tuned apply settings] ******** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 Thursday 19 September 2024 15:11:20 -0400 (0:00:00.023) 0:00:25.744 **** 9733 1726773080.01240: entering _queue_task() for managed_node3/command 9733 1726773080.01397: worker is 1 (out of 1 available) 9733 1726773080.01412: exiting _queue_task() for managed_node3/command 9733 1726773080.01424: done queuing things up, now waiting for results queue to drain 9733 1726773080.01426: waiting for pending results... 
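The "Restart tuned to apply active profile, mode changes" task above was skipped for item "tuned" because neither __kernel_settings_register_profile nor __kernel_settings_register_mode reported a change, and the "Tuned apply settings" task that follows checks the same registered results plus __kernel_settings_register_apply. The "is changed" test used in those conditionals essentially looks at the changed key of a registered result; a rough stand-in (not Ansible's actual implementation) is:

    def is_changed(result):
        # Rough stand-in for the Jinja "changed" test in the conditionals
        # above: a registered result counts as changed when its 'changed'
        # key is true.
        return bool(result.get("changed", False))

    register_profile = {"changed": False}  # hypothetical registered results
    register_mode = {"changed": False}

    # "__kernel_settings_register_profile is changed or
    #  __kernel_settings_register_mode is changed" -> False, so the task skips.
    print(is_changed(register_profile) or is_changed(register_mode))
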
10669 1726773080.01544: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 10669 1726773080.01656: in run() - task 0affffe7-6841-7dd6-8fa6-000000000411 10669 1726773080.01672: variable 'ansible_search_path' from source: unknown 10669 1726773080.01676: variable 'ansible_search_path' from source: unknown 10669 1726773080.01705: calling self._execute() 10669 1726773080.01769: variable 'ansible_host' from source: host vars for 'managed_node3' 10669 1726773080.01778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10669 1726773080.01788: variable 'omit' from source: magic vars 10669 1726773080.02106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10669 1726773080.02439: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10669 1726773080.02481: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10669 1726773080.02514: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10669 1726773080.02546: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10669 1726773080.02657: variable '__kernel_settings_register_profile' from source: set_fact 10669 1726773080.02683: Evaluated conditional (not __kernel_settings_register_profile is changed): True 10669 1726773080.02813: variable '__kernel_settings_register_mode' from source: set_fact 10669 1726773080.02825: Evaluated conditional (not __kernel_settings_register_mode is changed): True 10669 1726773080.02929: variable '__kernel_settings_register_apply' from source: set_fact 10669 1726773080.02941: Evaluated conditional (__kernel_settings_register_apply is changed): False 10669 1726773080.02945: when evaluation is False, skipping this task 10669 1726773080.02949: _execute() done 10669 1726773080.02951: dumping result to json 10669 1726773080.02954: done dumping result, returning 10669 1726773080.02960: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings [0affffe7-6841-7dd6-8fa6-000000000411] 10669 1726773080.02965: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000411 10669 1726773080.02994: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000411 10669 1726773080.02998: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_register_apply is changed", "skip_reason": "Conditional result was False" } 9733 1726773080.03330: no more pending results, returning what we have 9733 1726773080.03333: results queue empty 9733 1726773080.03333: checking for any_errors_fatal 9733 1726773080.03340: done checking for any_errors_fatal 9733 1726773080.03340: checking for max_fail_percentage 9733 1726773080.03341: done checking for max_fail_percentage 9733 1726773080.03342: checking to see if all hosts have failed and the running result is not ok 9733 1726773080.03342: done checking to see if all hosts have failed 9733 1726773080.03343: getting the remaining hosts for this loop 9733 1726773080.03343: done getting the remaining hosts for this loop 9733 1726773080.03346: getting the next task for host managed_node3 9733 1726773080.03350: done getting next task for host managed_node3 9733 1726773080.03353: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 9733 
1726773080.03354: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773080.03368: getting variables 9733 1726773080.03369: in VariableManager get_vars() 9733 1726773080.03403: Calling all_inventory to load vars for managed_node3 9733 1726773080.03405: Calling groups_inventory to load vars for managed_node3 9733 1726773080.03407: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773080.03414: Calling all_plugins_play to load vars for managed_node3 9733 1726773080.03415: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773080.03417: Calling groups_plugins_play to load vars for managed_node3 9733 1726773080.03532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773080.03726: done with get_vars() 9733 1726773080.03733: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Verify settings] ************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:166 Thursday 19 September 2024 15:11:20 -0400 (0:00:00.025) 0:00:25.770 **** 9733 1726773080.03802: entering _queue_task() for managed_node3/include_tasks 9733 1726773080.03971: worker is 1 (out of 1 available) 9733 1726773080.03991: exiting _queue_task() for managed_node3/include_tasks 9733 1726773080.04003: done queuing things up, now waiting for results queue to drain 9733 1726773080.04005: waiting for pending results... 
10673 1726773080.04127: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Verify settings 10673 1726773080.04251: in run() - task 0affffe7-6841-7dd6-8fa6-000000000412 10673 1726773080.04266: variable 'ansible_search_path' from source: unknown 10673 1726773080.04270: variable 'ansible_search_path' from source: unknown 10673 1726773080.04301: calling self._execute() 10673 1726773080.04368: variable 'ansible_host' from source: host vars for 'managed_node3' 10673 1726773080.04378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10673 1726773080.04388: variable 'omit' from source: magic vars 10673 1726773080.04710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10673 1726773080.04889: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10673 1726773080.04923: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10673 1726773080.04949: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10673 1726773080.04976: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10673 1726773080.05060: variable '__kernel_settings_register_apply' from source: set_fact 10673 1726773080.05088: Evaluated conditional (__kernel_settings_register_apply is changed): False 10673 1726773080.05093: when evaluation is False, skipping this task 10673 1726773080.05096: _execute() done 10673 1726773080.05099: dumping result to json 10673 1726773080.05103: done dumping result, returning 10673 1726773080.05109: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Verify settings [0affffe7-6841-7dd6-8fa6-000000000412] 10673 1726773080.05114: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000412 10673 1726773080.05132: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000412 10673 1726773080.05134: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_register_apply is changed", "skip_reason": "Conditional result was False" } 9733 1726773080.05366: no more pending results, returning what we have 9733 1726773080.05369: results queue empty 9733 1726773080.05369: checking for any_errors_fatal 9733 1726773080.05374: done checking for any_errors_fatal 9733 1726773080.05375: checking for max_fail_percentage 9733 1726773080.05376: done checking for max_fail_percentage 9733 1726773080.05376: checking to see if all hosts have failed and the running result is not ok 9733 1726773080.05376: done checking to see if all hosts have failed 9733 1726773080.05377: getting the remaining hosts for this loop 9733 1726773080.05377: done getting the remaining hosts for this loop 9733 1726773080.05382: getting the next task for host managed_node3 9733 1726773080.05388: done getting next task for host managed_node3 9733 1726773080.05391: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 9733 1726773080.05392: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773080.05403: getting variables 9733 1726773080.05404: in VariableManager get_vars() 9733 1726773080.05429: Calling all_inventory to load vars for managed_node3 9733 1726773080.05432: Calling groups_inventory to load vars for managed_node3 9733 1726773080.05434: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773080.05441: Calling all_plugins_play to load vars for managed_node3 9733 1726773080.05443: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773080.05444: Calling groups_plugins_play to load vars for managed_node3 9733 1726773080.05554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773080.05675: done with get_vars() 9733 1726773080.05687: done getting variables 9733 1726773080.05727: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:177 Thursday 19 September 2024 15:11:20 -0400 (0:00:00.019) 0:00:25.790 **** 9733 1726773080.05747: entering _queue_task() for managed_node3/set_fact 9733 1726773080.05911: worker is 1 (out of 1 available) 9733 1726773080.05926: exiting _queue_task() for managed_node3/set_fact 9733 1726773080.05939: done queuing things up, now waiting for results queue to drain 9733 1726773080.05941: waiting for pending results... 
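The skipped "Verify settings" include above and the "Set the flag that reboot is needed to apply changes" task that follows both work off facts the role records as it runs: set_fact returns its values under ansible_facts, and the controller merges them into the host's variables, which is why kernel_settings_reboot_required is available to later tasks as a host fact. A rough illustration of that merge, using the result shown below:

    host_vars = {"ansible_host": "managed_node3"}  # simplified host variable store

    set_fact_result = {  # same shape as the set_fact result below
        "ansible_facts": {"kernel_settings_reboot_required": False},
        "changed": False,
    }

    # The controller folds the returned ansible_facts into the host's variables,
    # so a later task (for example a reboot handler) can test the flag.
    host_vars.update(set_fact_result["ansible_facts"])
    print(host_vars["kernel_settings_reboot_required"])  # -> False
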
10674 1726773080.06061: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 10674 1726773080.06175: in run() - task 0affffe7-6841-7dd6-8fa6-000000000413 10674 1726773080.06191: variable 'ansible_search_path' from source: unknown 10674 1726773080.06194: variable 'ansible_search_path' from source: unknown 10674 1726773080.06220: calling self._execute() 10674 1726773080.06287: variable 'ansible_host' from source: host vars for 'managed_node3' 10674 1726773080.06295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10674 1726773080.06301: variable 'omit' from source: magic vars 10674 1726773080.06368: variable 'omit' from source: magic vars 10674 1726773080.06413: variable 'omit' from source: magic vars 10674 1726773080.06437: variable 'omit' from source: magic vars 10674 1726773080.06469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10674 1726773080.06499: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10674 1726773080.06517: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10674 1726773080.06532: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10674 1726773080.06544: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10674 1726773080.06567: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10674 1726773080.06572: variable 'ansible_host' from source: host vars for 'managed_node3' 10674 1726773080.06576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10674 1726773080.06649: Set connection var ansible_timeout to 10 10674 1726773080.06654: Set connection var ansible_shell_type to sh 10674 1726773080.06660: Set connection var ansible_module_compression to ZIP_DEFLATED 10674 1726773080.06666: Set connection var ansible_shell_executable to /bin/sh 10674 1726773080.06671: Set connection var ansible_pipelining to False 10674 1726773080.06678: Set connection var ansible_connection to ssh 10674 1726773080.06695: variable 'ansible_shell_executable' from source: unknown 10674 1726773080.06700: variable 'ansible_connection' from source: unknown 10674 1726773080.06704: variable 'ansible_module_compression' from source: unknown 10674 1726773080.06708: variable 'ansible_shell_type' from source: unknown 10674 1726773080.06710: variable 'ansible_shell_executable' from source: unknown 10674 1726773080.06712: variable 'ansible_host' from source: host vars for 'managed_node3' 10674 1726773080.06714: variable 'ansible_pipelining' from source: unknown 10674 1726773080.06716: variable 'ansible_timeout' from source: unknown 10674 1726773080.06718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10674 1726773080.06816: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10674 1726773080.06828: variable 'omit' from source: magic vars 10674 1726773080.06834: starting attempt loop 10674 1726773080.06838: running the handler 10674 1726773080.06847: handler 
run complete 10674 1726773080.06856: attempt loop complete, returning result 10674 1726773080.06860: _execute() done 10674 1726773080.06863: dumping result to json 10674 1726773080.06867: done dumping result, returning 10674 1726773080.06873: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes [0affffe7-6841-7dd6-8fa6-000000000413] 10674 1726773080.06879: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000413 10674 1726773080.06905: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000413 10674 1726773080.06909: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "kernel_settings_reboot_required": false }, "changed": false } 9733 1726773080.07026: no more pending results, returning what we have 9733 1726773080.07028: results queue empty 9733 1726773080.07029: checking for any_errors_fatal 9733 1726773080.07035: done checking for any_errors_fatal 9733 1726773080.07035: checking for max_fail_percentage 9733 1726773080.07037: done checking for max_fail_percentage 9733 1726773080.07037: checking to see if all hosts have failed and the running result is not ok 9733 1726773080.07038: done checking to see if all hosts have failed 9733 1726773080.07038: getting the remaining hosts for this loop 9733 1726773080.07039: done getting the remaining hosts for this loop 9733 1726773080.07042: getting the next task for host managed_node3 9733 1726773080.07047: done getting next task for host managed_node3 9733 1726773080.07050: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 9733 1726773080.07052: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773080.07061: getting variables 9733 1726773080.07062: in VariableManager get_vars() 9733 1726773080.07096: Calling all_inventory to load vars for managed_node3 9733 1726773080.07098: Calling groups_inventory to load vars for managed_node3 9733 1726773080.07099: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773080.07106: Calling all_plugins_play to load vars for managed_node3 9733 1726773080.07107: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773080.07109: Calling groups_plugins_play to load vars for managed_node3 9733 1726773080.07251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773080.07366: done with get_vars() 9733 1726773080.07373: done getting variables 9733 1726773080.07416: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:181 Thursday 19 September 2024 15:11:20 -0400 (0:00:00.016) 0:00:25.806 **** 9733 1726773080.07435: entering _queue_task() for managed_node3/set_fact 9733 1726773080.07589: worker is 1 (out of 1 available) 9733 1726773080.07604: exiting _queue_task() for managed_node3/set_fact 9733 1726773080.07616: done queuing things up, now waiting for results queue to drain 9733 1726773080.07618: waiting for pending results... 10675 1726773080.07730: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 10675 1726773080.07840: in run() - task 0affffe7-6841-7dd6-8fa6-000000000414 10675 1726773080.07855: variable 'ansible_search_path' from source: unknown 10675 1726773080.07859: variable 'ansible_search_path' from source: unknown 10675 1726773080.07887: calling self._execute() 10675 1726773080.07953: variable 'ansible_host' from source: host vars for 'managed_node3' 10675 1726773080.07962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10675 1726773080.07970: variable 'omit' from source: magic vars 10675 1726773080.08041: variable 'omit' from source: magic vars 10675 1726773080.08077: variable 'omit' from source: magic vars 10675 1726773080.08331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10675 1726773080.08506: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10675 1726773080.08540: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10675 1726773080.08568: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10675 1726773080.08597: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10675 1726773080.08702: variable '__kernel_settings_register_profile' from source: set_fact 10675 1726773080.08715: variable '__kernel_settings_register_mode' from source: set_fact 10675 1726773080.08721: variable '__kernel_settings_register_apply' from source: set_fact 10675 1726773080.08756: variable 'omit' from source: magic vars 10675 
1726773080.08774: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10675 1726773080.08806: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10675 1726773080.08823: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10675 1726773080.08836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10675 1726773080.08845: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10675 1726773080.08867: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10675 1726773080.08872: variable 'ansible_host' from source: host vars for 'managed_node3' 10675 1726773080.08876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10675 1726773080.08944: Set connection var ansible_timeout to 10 10675 1726773080.08949: Set connection var ansible_shell_type to sh 10675 1726773080.08955: Set connection var ansible_module_compression to ZIP_DEFLATED 10675 1726773080.08960: Set connection var ansible_shell_executable to /bin/sh 10675 1726773080.08966: Set connection var ansible_pipelining to False 10675 1726773080.08972: Set connection var ansible_connection to ssh 10675 1726773080.08989: variable 'ansible_shell_executable' from source: unknown 10675 1726773080.08993: variable 'ansible_connection' from source: unknown 10675 1726773080.08997: variable 'ansible_module_compression' from source: unknown 10675 1726773080.09000: variable 'ansible_shell_type' from source: unknown 10675 1726773080.09003: variable 'ansible_shell_executable' from source: unknown 10675 1726773080.09005: variable 'ansible_host' from source: host vars for 'managed_node3' 10675 1726773080.09008: variable 'ansible_pipelining' from source: unknown 10675 1726773080.09009: variable 'ansible_timeout' from source: unknown 10675 1726773080.09011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10675 1726773080.09073: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10675 1726773080.09083: variable 'omit' from source: magic vars 10675 1726773080.09090: starting attempt loop 10675 1726773080.09092: running the handler 10675 1726773080.09100: handler run complete 10675 1726773080.09105: attempt loop complete, returning result 10675 1726773080.09107: _execute() done 10675 1726773080.09109: dumping result to json 10675 1726773080.09112: done dumping result, returning 10675 1726773080.09116: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing [0affffe7-6841-7dd6-8fa6-000000000414] 10675 1726773080.09120: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000414 10675 1726773080.09136: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000414 10675 1726773080.09138: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_changed": false }, "changed": false } 9733 1726773080.09399: no more pending results, returning what we have 9733 1726773080.09401: results queue empty 9733 1726773080.09401: checking 
for any_errors_fatal 9733 1726773080.09405: done checking for any_errors_fatal 9733 1726773080.09405: checking for max_fail_percentage 9733 1726773080.09406: done checking for max_fail_percentage 9733 1726773080.09407: checking to see if all hosts have failed and the running result is not ok 9733 1726773080.09407: done checking to see if all hosts have failed 9733 1726773080.09407: getting the remaining hosts for this loop 9733 1726773080.09408: done getting the remaining hosts for this loop 9733 1726773080.09410: getting the next task for host managed_node3 9733 1726773080.09416: done getting next task for host managed_node3 9733 1726773080.09417: ^ task is: TASK: meta (role_complete) 9733 1726773080.09418: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773080.09425: getting variables 9733 1726773080.09426: in VariableManager get_vars() 9733 1726773080.09451: Calling all_inventory to load vars for managed_node3 9733 1726773080.09454: Calling groups_inventory to load vars for managed_node3 9733 1726773080.09455: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773080.09462: Calling all_plugins_play to load vars for managed_node3 9733 1726773080.09463: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773080.09465: Calling groups_plugins_play to load vars for managed_node3 9733 1726773080.09570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773080.09691: done with get_vars() 9733 1726773080.09699: done getting variables 9733 1726773080.09749: done queuing things up, now waiting for results queue to drain 9733 1726773080.09750: results queue empty 9733 1726773080.09751: checking for any_errors_fatal 9733 1726773080.09753: done checking for any_errors_fatal 9733 1726773080.09754: checking for max_fail_percentage 9733 1726773080.09754: done checking for max_fail_percentage 9733 1726773080.09758: checking to see if all hosts have failed and the running result is not ok 9733 1726773080.09759: done checking to see if all hosts have failed 9733 1726773080.09759: getting the remaining hosts for this loop 9733 1726773080.09760: done getting the remaining hosts for this loop 9733 1726773080.09762: getting the next task for host managed_node3 9733 1726773080.09765: done getting next task for host managed_node3 9733 1726773080.09766: ^ task is: TASK: Ensure role reported not changed 9733 1726773080.09767: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773080.09768: getting variables 9733 1726773080.09768: in VariableManager get_vars() 9733 1726773080.09775: Calling all_inventory to load vars for managed_node3 9733 1726773080.09777: Calling groups_inventory to load vars for managed_node3 9733 1726773080.09778: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773080.09783: Calling all_plugins_play to load vars for managed_node3 9733 1726773080.09784: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773080.09788: Calling groups_plugins_play to load vars for managed_node3 9733 1726773080.09860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773080.09993: done with get_vars() 9733 1726773080.09998: done getting variables 9733 1726773080.10021: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure role reported not changed] **************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_simple_settings.yml:67 Thursday 19 September 2024 15:11:20 -0400 (0:00:00.026) 0:00:25.832 **** 9733 1726773080.10038: entering _queue_task() for managed_node3/assert 9733 1726773080.10200: worker is 1 (out of 1 available) 9733 1726773080.10216: exiting _queue_task() for managed_node3/assert 9733 1726773080.10227: done queuing things up, now waiting for results queue to drain 9733 1726773080.10230: waiting for pending results... 10676 1726773080.10342: running TaskExecutor() for managed_node3/TASK: Ensure role reported not changed 10676 1726773080.10440: in run() - task 0affffe7-6841-7dd6-8fa6-00000000000f 10676 1726773080.10456: variable 'ansible_search_path' from source: unknown 10676 1726773080.10487: calling self._execute() 10676 1726773080.10554: variable 'ansible_host' from source: host vars for 'managed_node3' 10676 1726773080.10563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10676 1726773080.10572: variable 'omit' from source: magic vars 10676 1726773080.10644: variable 'omit' from source: magic vars 10676 1726773080.10668: variable 'omit' from source: magic vars 10676 1726773080.10695: variable 'omit' from source: magic vars 10676 1726773080.10728: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10676 1726773080.10756: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10676 1726773080.10775: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10676 1726773080.10793: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10676 1726773080.10805: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10676 1726773080.10828: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10676 1726773080.10833: variable 'ansible_host' from source: host vars for 'managed_node3' 10676 1726773080.10837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10676 1726773080.10908: Set connection var ansible_timeout to 10 10676 
1726773080.10913: Set connection var ansible_shell_type to sh 10676 1726773080.10919: Set connection var ansible_module_compression to ZIP_DEFLATED 10676 1726773080.10925: Set connection var ansible_shell_executable to /bin/sh 10676 1726773080.10930: Set connection var ansible_pipelining to False 10676 1726773080.10937: Set connection var ansible_connection to ssh 10676 1726773080.10953: variable 'ansible_shell_executable' from source: unknown 10676 1726773080.10958: variable 'ansible_connection' from source: unknown 10676 1726773080.10961: variable 'ansible_module_compression' from source: unknown 10676 1726773080.10965: variable 'ansible_shell_type' from source: unknown 10676 1726773080.10968: variable 'ansible_shell_executable' from source: unknown 10676 1726773080.10972: variable 'ansible_host' from source: host vars for 'managed_node3' 10676 1726773080.10977: variable 'ansible_pipelining' from source: unknown 10676 1726773080.10980: variable 'ansible_timeout' from source: unknown 10676 1726773080.10984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10676 1726773080.11080: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10676 1726773080.11093: variable 'omit' from source: magic vars 10676 1726773080.11098: starting attempt loop 10676 1726773080.11100: running the handler 10676 1726773080.11358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10676 1726773080.12918: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10676 1726773080.12974: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10676 1726773080.13004: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10676 1726773080.13033: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10676 1726773080.13053: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10676 1726773080.13103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10676 1726773080.13126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10676 1726773080.13145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10676 1726773080.13172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10676 1726773080.13187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10676 1726773080.13268: variable 
'__kernel_settings_changed' from source: set_fact 10676 1726773080.13286: Evaluated conditional (not __kernel_settings_changed | d(false)): True 10676 1726773080.13293: handler run complete 10676 1726773080.13310: attempt loop complete, returning result 10676 1726773080.13314: _execute() done 10676 1726773080.13317: dumping result to json 10676 1726773080.13321: done dumping result, returning 10676 1726773080.13328: done running TaskExecutor() for managed_node3/TASK: Ensure role reported not changed [0affffe7-6841-7dd6-8fa6-00000000000f] 10676 1726773080.13333: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000000f 10676 1726773080.13357: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000000f 10676 1726773080.13360: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 9733 1726773080.13468: no more pending results, returning what we have 9733 1726773080.13471: results queue empty 9733 1726773080.13472: checking for any_errors_fatal 9733 1726773080.13473: done checking for any_errors_fatal 9733 1726773080.13474: checking for max_fail_percentage 9733 1726773080.13475: done checking for max_fail_percentage 9733 1726773080.13476: checking to see if all hosts have failed and the running result is not ok 9733 1726773080.13476: done checking to see if all hosts have failed 9733 1726773080.13477: getting the remaining hosts for this loop 9733 1726773080.13478: done getting the remaining hosts for this loop 9733 1726773080.13483: getting the next task for host managed_node3 9733 1726773080.13490: done getting next task for host managed_node3 9733 1726773080.13492: ^ task is: TASK: Cleanup 9733 1726773080.13494: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773080.13497: getting variables 9733 1726773080.13498: in VariableManager get_vars() 9733 1726773080.13531: Calling all_inventory to load vars for managed_node3 9733 1726773080.13534: Calling groups_inventory to load vars for managed_node3 9733 1726773080.13535: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773080.13545: Calling all_plugins_play to load vars for managed_node3 9733 1726773080.13547: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773080.13556: Calling groups_plugins_play to load vars for managed_node3 9733 1726773080.13694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773080.13812: done with get_vars() 9733 1726773080.13821: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_simple_settings.yml:72 Thursday 19 September 2024 15:11:20 -0400 (0:00:00.038) 0:00:25.871 **** 9733 1726773080.13887: entering _queue_task() for managed_node3/include_tasks 9733 1726773080.14049: worker is 1 (out of 1 available) 9733 1726773080.14066: exiting _queue_task() for managed_node3/include_tasks 9733 1726773080.14079: done queuing things up, now waiting for results queue to drain 9733 1726773080.14083: waiting for pending results... 
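The assertion that just completed ("Ensure role reported not changed", tests_simple_settings.yml:67) evaluates the conditional not __kernel_settings_changed | d(false) against the fact the role set in the previous task. A minimal sketch of an equivalent task, assuming the test checks only that fact (the exact task body at that path is not shown in the trace):

    - name: Ensure role reported not changed
      assert:
        that:
          - not __kernel_settings_changed | d(false)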
10677 1726773080.14200: running TaskExecutor() for managed_node3/TASK: Cleanup 10677 1726773080.14299: in run() - task 0affffe7-6841-7dd6-8fa6-000000000010 10677 1726773080.14314: variable 'ansible_search_path' from source: unknown 10677 1726773080.14344: calling self._execute() 10677 1726773080.14416: variable 'ansible_host' from source: host vars for 'managed_node3' 10677 1726773080.14424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10677 1726773080.14432: variable 'omit' from source: magic vars 10677 1726773080.14499: _execute() done 10677 1726773080.14504: dumping result to json 10677 1726773080.14507: done dumping result, returning 10677 1726773080.14511: done running TaskExecutor() for managed_node3/TASK: Cleanup [0affffe7-6841-7dd6-8fa6-000000000010] 10677 1726773080.14516: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000010 10677 1726773080.14538: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000010 10677 1726773080.14540: WORKER PROCESS EXITING 9733 1726773080.14688: no more pending results, returning what we have 9733 1726773080.14691: in VariableManager get_vars() 9733 1726773080.14718: Calling all_inventory to load vars for managed_node3 9733 1726773080.14720: Calling groups_inventory to load vars for managed_node3 9733 1726773080.14721: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773080.14728: Calling all_plugins_play to load vars for managed_node3 9733 1726773080.14730: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773080.14732: Calling groups_plugins_play to load vars for managed_node3 9733 1726773080.14840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773080.14993: done with get_vars() 9733 1726773080.14999: variable 'ansible_search_path' from source: unknown 9733 1726773080.15008: we have included files to process 9733 1726773080.15009: generating all_blocks data 9733 1726773080.15012: done generating all_blocks data 9733 1726773080.15015: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml 9733 1726773080.15016: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml 9733 1726773080.15017: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml for managed_node3 9733 1726773080.15621: done processing included file 9733 1726773080.15623: iterating over new_blocks loaded from include file 9733 1726773080.15624: in VariableManager get_vars() 9733 1726773080.15633: done with get_vars() 9733 1726773080.15634: filtering new block on tags 9733 1726773080.15649: done filtering new block on tags 9733 1726773080.15651: in VariableManager get_vars() 9733 1726773080.15659: done with get_vars() 9733 1726773080.15660: filtering new block on tags 9733 1726773080.15717: done filtering new block on tags 9733 1726773080.15719: done iterating over new_blocks loaded from include file 9733 1726773080.15719: extending task lists for all hosts with included blocks 9733 1726773080.17137: done extending task lists 9733 1726773080.17138: done processing included files 9733 1726773080.17139: results queue empty 9733 1726773080.17139: checking for any_errors_fatal 9733 
1726773080.17141: done checking for any_errors_fatal 9733 1726773080.17142: checking for max_fail_percentage 9733 1726773080.17142: done checking for max_fail_percentage 9733 1726773080.17143: checking to see if all hosts have failed and the running result is not ok 9733 1726773080.17143: done checking to see if all hosts have failed 9733 1726773080.17144: getting the remaining hosts for this loop 9733 1726773080.17144: done getting the remaining hosts for this loop 9733 1726773080.17146: getting the next task for host managed_node3 9733 1726773080.17148: done getting next task for host managed_node3 9733 1726773080.17150: ^ task is: TASK: Show current tuned profile settings 9733 1726773080.17151: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773080.17153: getting variables 9733 1726773080.17154: in VariableManager get_vars() 9733 1726773080.17162: Calling all_inventory to load vars for managed_node3 9733 1726773080.17163: Calling groups_inventory to load vars for managed_node3 9733 1726773080.17164: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773080.17168: Calling all_plugins_play to load vars for managed_node3 9733 1726773080.17169: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773080.17171: Calling groups_plugins_play to load vars for managed_node3 9733 1726773080.17263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773080.17368: done with get_vars() 9733 1726773080.17375: done getting variables 9733 1726773080.17404: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current tuned profile settings] ************************************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:2 Thursday 19 September 2024 15:11:20 -0400 (0:00:00.035) 0:00:25.906 **** 9733 1726773080.17423: entering _queue_task() for managed_node3/command 9733 1726773080.17597: worker is 1 (out of 1 available) 9733 1726773080.17611: exiting _queue_task() for managed_node3/command 9733 1726773080.17623: done queuing things up, now waiting for results queue to drain 9733 1726773080.17625: waiting for pending results... 
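The "Cleanup" task at tests_simple_settings.yml:72 pulls in tasks/cleanup.yml, whose first task ("Show current tuned profile settings", cleanup.yml:2) is queued next and, per the result further down in the trace, simply cats the generated tuned profile without reporting a change. A plausible reconstruction, hedged because only the task names, paths, and the final cmd/argv appear in the log:

    # tests_simple_settings.yml, always section (reconstruction)
    - name: Cleanup
      include_tasks: tasks/cleanup.yml

    # tasks/cleanup.yml (reconstruction)
    - name: Show current tuned profile settings
      command: cat /etc/tuned/kernel_settings/tuned.conf
      changed_when: false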
10678 1726773080.17742: running TaskExecutor() for managed_node3/TASK: Show current tuned profile settings 10678 1726773080.17846: in run() - task 0affffe7-6841-7dd6-8fa6-000000000588 10678 1726773080.17864: variable 'ansible_search_path' from source: unknown 10678 1726773080.17867: variable 'ansible_search_path' from source: unknown 10678 1726773080.17896: calling self._execute() 10678 1726773080.17967: variable 'ansible_host' from source: host vars for 'managed_node3' 10678 1726773080.17976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10678 1726773080.17986: variable 'omit' from source: magic vars 10678 1726773080.18057: variable 'omit' from source: magic vars 10678 1726773080.18088: variable 'omit' from source: magic vars 10678 1726773080.18318: variable '__kernel_settings_profile_filename' from source: role '' exported vars 10678 1726773080.18375: variable '__kernel_settings_profile_dir' from source: role '' exported vars 10678 1726773080.18438: variable '__kernel_settings_profile_parent' from source: set_fact 10678 1726773080.18447: variable '__kernel_settings_tuned_profile' from source: role '' exported vars 10678 1726773080.18481: variable 'omit' from source: magic vars 10678 1726773080.18515: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10678 1726773080.18541: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10678 1726773080.18559: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10678 1726773080.18574: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10678 1726773080.18588: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10678 1726773080.18613: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10678 1726773080.18618: variable 'ansible_host' from source: host vars for 'managed_node3' 10678 1726773080.18623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10678 1726773080.18695: Set connection var ansible_timeout to 10 10678 1726773080.18700: Set connection var ansible_shell_type to sh 10678 1726773080.18706: Set connection var ansible_module_compression to ZIP_DEFLATED 10678 1726773080.18711: Set connection var ansible_shell_executable to /bin/sh 10678 1726773080.18717: Set connection var ansible_pipelining to False 10678 1726773080.18724: Set connection var ansible_connection to ssh 10678 1726773080.18739: variable 'ansible_shell_executable' from source: unknown 10678 1726773080.18743: variable 'ansible_connection' from source: unknown 10678 1726773080.18746: variable 'ansible_module_compression' from source: unknown 10678 1726773080.18750: variable 'ansible_shell_type' from source: unknown 10678 1726773080.18754: variable 'ansible_shell_executable' from source: unknown 10678 1726773080.18758: variable 'ansible_host' from source: host vars for 'managed_node3' 10678 1726773080.18762: variable 'ansible_pipelining' from source: unknown 10678 1726773080.18765: variable 'ansible_timeout' from source: unknown 10678 1726773080.18770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10678 1726773080.18861: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10678 1726773080.18872: variable 'omit' from source: magic vars 10678 1726773080.18878: starting attempt loop 10678 1726773080.18882: running the handler 10678 1726773080.18896: _low_level_execute_command(): starting 10678 1726773080.18906: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10678 1726773080.21330: stdout chunk (state=2): >>>/root <<< 10678 1726773080.21447: stderr chunk (state=3): >>><<< 10678 1726773080.21454: stdout chunk (state=3): >>><<< 10678 1726773080.21472: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10678 1726773080.21486: _low_level_execute_command(): starting 10678 1726773080.21493: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773080.2147958-10678-27103629051929 `" && echo ansible-tmp-1726773080.2147958-10678-27103629051929="` echo /root/.ansible/tmp/ansible-tmp-1726773080.2147958-10678-27103629051929 `" ) && sleep 0' 10678 1726773080.24038: stdout chunk (state=2): >>>ansible-tmp-1726773080.2147958-10678-27103629051929=/root/.ansible/tmp/ansible-tmp-1726773080.2147958-10678-27103629051929 <<< 10678 1726773080.24170: stderr chunk (state=3): >>><<< 10678 1726773080.24177: stdout chunk (state=3): >>><<< 10678 1726773080.24196: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773080.2147958-10678-27103629051929=/root/.ansible/tmp/ansible-tmp-1726773080.2147958-10678-27103629051929 , stderr= 10678 1726773080.24221: variable 'ansible_module_compression' from source: unknown 10678 1726773080.24259: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10678 1726773080.24290: variable 'ansible_facts' from source: unknown 10678 1726773080.24362: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773080.2147958-10678-27103629051929/AnsiballZ_command.py 10678 1726773080.24463: Sending initial data 10678 1726773080.24470: Sent initial data (154 bytes) 10678 1726773080.27072: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpt317_bly /root/.ansible/tmp/ansible-tmp-1726773080.2147958-10678-27103629051929/AnsiballZ_command.py <<< 10678 1726773080.28246: stderr chunk (state=3): >>><<< 10678 1726773080.28255: stdout chunk (state=3): >>><<< 10678 1726773080.28276: done transferring module to remote 10678 1726773080.28290: _low_level_execute_command(): starting 10678 1726773080.28296: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773080.2147958-10678-27103629051929/ /root/.ansible/tmp/ansible-tmp-1726773080.2147958-10678-27103629051929/AnsiballZ_command.py && sleep 0' 10678 1726773080.30780: stderr chunk (state=2): >>><<< 10678 1726773080.30793: stdout chunk (state=2): >>><<< 10678 1726773080.30808: _low_level_execute_command() done: rc=0, stdout=, stderr= 10678 1726773080.30812: _low_level_execute_command(): starting 10678 1726773080.30819: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773080.2147958-10678-27103629051929/AnsiballZ_command.py && sleep 0' 10678 1726773080.46531: stdout chunk (state=2): >>> {"changed": true, "stdout": "#\n# Ansible managed\n#\n# 
system_role:kernel_settings\n\n[main]\nsummary = kernel settings\n[sysctl]\nfs.epoll.max_user_watches = 785592\nfs.file-max = 379724\n[sysfs]\n/sys/kernel/debug/x86/ibrs_enabled = 0\n/sys/kernel/debug/x86/pti_enabled = 0\n/sys/kernel/debug/x86/retp_enabled = 0\n[vm]\ntransparent_hugepages = madvise", "stderr": "", "rc": 0, "cmd": ["cat", "/etc/tuned/kernel_settings/tuned.conf"], "start": "2024-09-19 15:11:20.460496", "end": "2024-09-19 15:11:20.463291", "delta": "0:00:00.002795", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /etc/tuned/kernel_settings/tuned.conf", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10678 1726773080.47710: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10678 1726773080.47761: stderr chunk (state=3): >>><<< 10678 1726773080.47769: stdout chunk (state=3): >>><<< 10678 1726773080.47788: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "#\n# Ansible managed\n#\n# system_role:kernel_settings\n\n[main]\nsummary = kernel settings\n[sysctl]\nfs.epoll.max_user_watches = 785592\nfs.file-max = 379724\n[sysfs]\n/sys/kernel/debug/x86/ibrs_enabled = 0\n/sys/kernel/debug/x86/pti_enabled = 0\n/sys/kernel/debug/x86/retp_enabled = 0\n[vm]\ntransparent_hugepages = madvise", "stderr": "", "rc": 0, "cmd": ["cat", "/etc/tuned/kernel_settings/tuned.conf"], "start": "2024-09-19 15:11:20.460496", "end": "2024-09-19 15:11:20.463291", "delta": "0:00:00.002795", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /etc/tuned/kernel_settings/tuned.conf", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.47.99 closed. 
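The tuned.conf dumped above is what the role rendered from its input variables. A sketch of role variables that would plausibly produce those [sysctl], [sysfs] and [vm] sections; of these names only kernel_settings_sysctl is confirmed by this trace, while kernel_settings_sysfs and kernel_settings_transparent_hugepages are assumed from the role's documented interface:

    kernel_settings_sysctl:
      - name: fs.epoll.max_user_watches
        value: 785592
      - name: fs.file-max
        value: 379724
    kernel_settings_sysfs:                           # assumed variable name
      - name: /sys/kernel/debug/x86/ibrs_enabled
        value: 0
      - name: /sys/kernel/debug/x86/pti_enabled
        value: 0
      - name: /sys/kernel/debug/x86/retp_enabled
        value: 0
    kernel_settings_transparent_hugepages: madvise   # assumed variable name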
10678 1726773080.47821: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773080.2147958-10678-27103629051929/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10678 1726773080.47831: _low_level_execute_command(): starting 10678 1726773080.47837: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773080.2147958-10678-27103629051929/ > /dev/null 2>&1 && sleep 0' 10678 1726773080.50358: stderr chunk (state=2): >>><<< 10678 1726773080.50369: stdout chunk (state=2): >>><<< 10678 1726773080.50389: _low_level_execute_command() done: rc=0, stdout=, stderr= 10678 1726773080.50397: handler run complete 10678 1726773080.50414: Evaluated conditional (False): False 10678 1726773080.50424: attempt loop complete, returning result 10678 1726773080.50428: _execute() done 10678 1726773080.50432: dumping result to json 10678 1726773080.50438: done dumping result, returning 10678 1726773080.50445: done running TaskExecutor() for managed_node3/TASK: Show current tuned profile settings [0affffe7-6841-7dd6-8fa6-000000000588] 10678 1726773080.50451: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000588 10678 1726773080.50482: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000588 10678 1726773080.50490: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "cat", "/etc/tuned/kernel_settings/tuned.conf" ], "delta": "0:00:00.002795", "end": "2024-09-19 15:11:20.463291", "rc": 0, "start": "2024-09-19 15:11:20.460496" } STDOUT: # # Ansible managed # # system_role:kernel_settings [main] summary = kernel settings [sysctl] fs.epoll.max_user_watches = 785592 fs.file-max = 379724 [sysfs] /sys/kernel/debug/x86/ibrs_enabled = 0 /sys/kernel/debug/x86/pti_enabled = 0 /sys/kernel/debug/x86/retp_enabled = 0 [vm] transparent_hugepages = madvise 9733 1726773080.50740: no more pending results, returning what we have 9733 1726773080.50742: results queue empty 9733 1726773080.50743: checking for any_errors_fatal 9733 1726773080.50744: done checking for any_errors_fatal 9733 1726773080.50745: checking for max_fail_percentage 9733 1726773080.50746: done checking for max_fail_percentage 9733 1726773080.50746: checking to see if all hosts have failed and the running result is not ok 9733 1726773080.50747: done checking to see if all hosts have failed 9733 1726773080.50747: getting the remaining hosts for this loop 9733 1726773080.50748: done getting the remaining hosts for this loop 9733 1726773080.50750: getting the next task for host managed_node3 9733 1726773080.50757: done getting next task for host managed_node3 9733 1726773080.50759: ^ task is: TASK: Run role with purge to remove everything 9733 1726773080.50761: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773080.50763: getting variables 9733 1726773080.50764: in VariableManager get_vars() 9733 1726773080.50795: Calling all_inventory to load vars for managed_node3 9733 1726773080.50797: Calling groups_inventory to load vars for managed_node3 9733 1726773080.50799: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773080.50808: Calling all_plugins_play to load vars for managed_node3 9733 1726773080.50810: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773080.50811: Calling groups_plugins_play to load vars for managed_node3 9733 1726773080.50922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773080.51041: done with get_vars() 9733 1726773080.51049: done getting variables TASK [Run role with purge to remove everything] ******************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:9 Thursday 19 September 2024 15:11:20 -0400 (0:00:00.336) 0:00:26.243 **** 9733 1726773080.51121: entering _queue_task() for managed_node3/include_role 9733 1726773080.51303: worker is 1 (out of 1 available) 9733 1726773080.51318: exiting _queue_task() for managed_node3/include_role 9733 1726773080.51329: done queuing things up, now waiting for results queue to drain 9733 1726773080.51331: waiting for pending results... 10686 1726773080.51460: running TaskExecutor() for managed_node3/TASK: Run role with purge to remove everything 10686 1726773080.51575: in run() - task 0affffe7-6841-7dd6-8fa6-00000000058a 10686 1726773080.51595: variable 'ansible_search_path' from source: unknown 10686 1726773080.51600: variable 'ansible_search_path' from source: unknown 10686 1726773080.51628: calling self._execute() 10686 1726773080.51706: variable 'ansible_host' from source: host vars for 'managed_node3' 10686 1726773080.51715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10686 1726773080.51724: variable 'omit' from source: magic vars 10686 1726773080.51806: _execute() done 10686 1726773080.51812: dumping result to json 10686 1726773080.51816: done dumping result, returning 10686 1726773080.51822: done running TaskExecutor() for managed_node3/TASK: Run role with purge to remove everything [0affffe7-6841-7dd6-8fa6-00000000058a] 10686 1726773080.51830: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000058a 10686 1726773080.51860: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000058a 10686 1726773080.51863: WORKER PROCESS EXITING 9733 1726773080.51967: no more pending results, returning what we have 9733 1726773080.51972: in VariableManager get_vars() 9733 1726773080.52010: Calling all_inventory to load vars for managed_node3 9733 1726773080.52012: Calling groups_inventory to load vars for managed_node3 9733 1726773080.52014: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773080.52025: Calling all_plugins_play to load vars for managed_node3 9733 1726773080.52028: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773080.52030: Calling groups_plugins_play to load vars for managed_node3 9733 
1726773080.52193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773080.52305: done with get_vars() 9733 1726773080.52310: variable 'ansible_search_path' from source: unknown 9733 1726773080.52311: variable 'ansible_search_path' from source: unknown 9733 1726773080.52500: variable 'omit' from source: magic vars 9733 1726773080.52523: variable 'omit' from source: magic vars 9733 1726773080.52532: variable 'omit' from source: magic vars 9733 1726773080.52534: we have included files to process 9733 1726773080.52535: generating all_blocks data 9733 1726773080.52536: done generating all_blocks data 9733 1726773080.52539: processing included file: fedora.linux_system_roles.kernel_settings 9733 1726773080.52552: in VariableManager get_vars() 9733 1726773080.52563: done with get_vars() 9733 1726773080.52583: in VariableManager get_vars() 9733 1726773080.52600: done with get_vars() 9733 1726773080.52629: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml 9733 1726773080.52667: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml 9733 1726773080.52682: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml 9733 1726773080.52732: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml 9733 1726773080.53071: in VariableManager get_vars() 9733 1726773080.53090: done with get_vars() 9733 1726773080.53917: in VariableManager get_vars() 9733 1726773080.53932: done with get_vars() 9733 1726773080.54038: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml 9733 1726773080.54420: iterating over new_blocks loaded from include file 9733 1726773080.54421: in VariableManager get_vars() 9733 1726773080.54434: done with get_vars() 9733 1726773080.54435: filtering new block on tags 9733 1726773080.54623: done filtering new block on tags 9733 1726773080.54626: in VariableManager get_vars() 9733 1726773080.54636: done with get_vars() 9733 1726773080.54637: filtering new block on tags 9733 1726773080.54662: done filtering new block on tags 9733 1726773080.54663: in VariableManager get_vars() 9733 1726773080.54672: done with get_vars() 9733 1726773080.54674: filtering new block on tags 9733 1726773080.54765: done filtering new block on tags 9733 1726773080.54767: in VariableManager get_vars() 9733 1726773080.54777: done with get_vars() 9733 1726773080.54778: filtering new block on tags 9733 1726773080.54791: done filtering new block on tags 9733 1726773080.54793: done iterating over new_blocks loaded from include file 9733 1726773080.54793: extending task lists for all hosts with included blocks 9733 1726773080.54972: done extending task lists 9733 1726773080.54973: done processing included files 9733 1726773080.54973: results queue empty 9733 1726773080.54973: checking for any_errors_fatal 9733 1726773080.54976: done checking for any_errors_fatal 9733 1726773080.54977: checking for max_fail_percentage 9733 1726773080.54977: done checking for max_fail_percentage 9733 1726773080.54978: checking to see if all hosts have failed and the running result is not ok 9733 1726773080.54978: done checking to see if all hosts have failed 9733 1726773080.54979: getting the remaining hosts for this loop 9733 1726773080.54979: 
done getting the remaining hosts for this loop 9733 1726773080.54981: getting the next task for host managed_node3 9733 1726773080.54983: done getting next task for host managed_node3 9733 1726773080.54987: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 9733 1726773080.54989: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773080.54995: getting variables 9733 1726773080.54996: in VariableManager get_vars() 9733 1726773080.55005: Calling all_inventory to load vars for managed_node3 9733 1726773080.55007: Calling groups_inventory to load vars for managed_node3 9733 1726773080.55008: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773080.55011: Calling all_plugins_play to load vars for managed_node3 9733 1726773080.55013: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773080.55014: Calling groups_plugins_play to load vars for managed_node3 9733 1726773080.55093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773080.55202: done with get_vars() 9733 1726773080.55209: done getting variables 9733 1726773080.55236: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2 Thursday 19 September 2024 15:11:20 -0400 (0:00:00.041) 0:00:26.285 **** 9733 1726773080.55261: entering _queue_task() for managed_node3/fail 9733 1726773080.55462: worker is 1 (out of 1 available) 9733 1726773080.55477: exiting _queue_task() for managed_node3/fail 9733 1726773080.55492: done queuing things up, now waiting for results queue to drain 9733 1726773080.55494: waiting for pending results... 
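The role re-entry a few entries back ("Run role with purge to remove everything", cleanup.yml:9) is what queued the guard task announced above; it invokes fedora.linux_system_roles.kernel_settings again to strip the settings it applied. A sketch of how such an invocation typically looks, with the purge variable name assumed since only the task name appears in the trace:

    - name: Run role with purge to remove everything
      include_role:
        name: fedora.linux_system_roles.kernel_settings
      vars:
        kernel_settings_purge: true                  # assumed variable name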
10687 1726773080.55630: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 10687 1726773080.55767: in run() - task 0affffe7-6841-7dd6-8fa6-0000000006d8 10687 1726773080.55788: variable 'ansible_search_path' from source: unknown 10687 1726773080.55794: variable 'ansible_search_path' from source: unknown 10687 1726773080.55825: calling self._execute() 10687 1726773080.55903: variable 'ansible_host' from source: host vars for 'managed_node3' 10687 1726773080.55912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10687 1726773080.55921: variable 'omit' from source: magic vars 10687 1726773080.56294: variable 'kernel_settings_sysctl' from source: include params 10687 1726773080.56306: variable '__kernel_settings_state_empty' from source: role '' all vars 10687 1726773080.56317: Evaluated conditional (kernel_settings_sysctl != __kernel_settings_state_empty): True 10687 1726773080.56528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10687 1726773080.58105: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10687 1726773080.58160: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10687 1726773080.58194: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10687 1726773080.58233: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10687 1726773080.58254: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10687 1726773080.58313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10687 1726773080.58338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10687 1726773080.58358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10687 1726773080.58390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10687 1726773080.58402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10687 1726773080.58443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10687 1726773080.58461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10687 1726773080.58478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 10687 1726773080.58508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10687 1726773080.58519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10687 1726773080.58550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10687 1726773080.58568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10687 1726773080.58588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10687 1726773080.58614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10687 1726773080.58625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10687 1726773080.58815: variable 'kernel_settings_sysctl' from source: include params 10687 1726773080.58837: Evaluated conditional ((kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", true) | list | length > 0) or (kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", false) | list | length > 0)): False 10687 1726773080.58842: when evaluation is False, skipping this task 10687 1726773080.58845: _execute() done 10687 1726773080.58849: dumping result to json 10687 1726773080.58853: done dumping result, returning 10687 1726773080.58860: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [0affffe7-6841-7dd6-8fa6-0000000006d8] 10687 1726773080.58865: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006d8 10687 1726773080.58894: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006d8 10687 1726773080.58897: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "(kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", true) | list | length > 0) or (kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", false) | list | length > 0)", "skip_reason": "Conditional result was False" } 9733 1726773080.59020: no more pending results, returning what we have 9733 1726773080.59023: results queue empty 9733 1726773080.59023: checking for any_errors_fatal 9733 1726773080.59025: done checking for any_errors_fatal 9733 1726773080.59025: checking for max_fail_percentage 9733 1726773080.59027: done checking for max_fail_percentage 9733 1726773080.59028: checking to see if all hosts have failed and the running result is not ok 9733 1726773080.59028: done checking to see if all hosts have failed 9733 1726773080.59029: getting the remaining 
hosts for this loop 9733 1726773080.59030: done getting the remaining hosts for this loop 9733 1726773080.59032: getting the next task for host managed_node3 9733 1726773080.59038: done getting next task for host managed_node3 9733 1726773080.59042: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 9733 1726773080.59045: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773080.59065: getting variables 9733 1726773080.59067: in VariableManager get_vars() 9733 1726773080.59105: Calling all_inventory to load vars for managed_node3 9733 1726773080.59108: Calling groups_inventory to load vars for managed_node3 9733 1726773080.59110: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773080.59119: Calling all_plugins_play to load vars for managed_node3 9733 1726773080.59122: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773080.59124: Calling groups_plugins_play to load vars for managed_node3 9733 1726773080.59251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773080.59405: done with get_vars() 9733 1726773080.59413: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Set version specific variables] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:9 Thursday 19 September 2024 15:11:20 -0400 (0:00:00.042) 0:00:26.327 **** 9733 1726773080.59477: entering _queue_task() for managed_node3/include_tasks 9733 1726773080.59648: worker is 1 (out of 1 available) 9733 1726773080.59664: exiting _queue_task() for managed_node3/include_tasks 9733 1726773080.59676: done queuing things up, now waiting for results queue to drain 9733 1726773080.59678: waiting for pending results... 
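For reference, the "Check sysctl settings for boolean values" task skipped above is driven entirely by the when expression quoted in its skip result: it selects every kernel_settings_sysctl entry whose value is a raw boolean (true or false), and the task only runs when at least one such entry exists. A minimal sketch of a task guarded by that exact condition; the fail module and its message are assumptions for illustration, not taken from the role:

  - name: Check sysctl settings for boolean values (illustrative sketch)
    ansible.builtin.fail:
      msg: >-
        kernel_settings_sysctl values must not be raw booleans;
        quote the value instead
    when: >-
      (kernel_settings_sysctl | selectattr("value", "defined") |
      selectattr("value", "sameas", true) | list | length > 0) or
      (kernel_settings_sysctl | selectattr("value", "defined") |
      selectattr("value", "sameas", false) | list | length > 0)

Because no entry carried a bare boolean here, the condition evaluated to False and the task was skipped for managed_node3.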
10688 1726773080.59810: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 10688 1726773080.59933: in run() - task 0affffe7-6841-7dd6-8fa6-0000000006d9 10688 1726773080.59949: variable 'ansible_search_path' from source: unknown 10688 1726773080.59953: variable 'ansible_search_path' from source: unknown 10688 1726773080.59982: calling self._execute() 10688 1726773080.60062: variable 'ansible_host' from source: host vars for 'managed_node3' 10688 1726773080.60071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10688 1726773080.60080: variable 'omit' from source: magic vars 10688 1726773080.60161: _execute() done 10688 1726773080.60167: dumping result to json 10688 1726773080.60170: done dumping result, returning 10688 1726773080.60177: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables [0affffe7-6841-7dd6-8fa6-0000000006d9] 10688 1726773080.60184: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006d9 10688 1726773080.60212: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006d9 10688 1726773080.60216: WORKER PROCESS EXITING 9733 1726773080.60320: no more pending results, returning what we have 9733 1726773080.60324: in VariableManager get_vars() 9733 1726773080.60361: Calling all_inventory to load vars for managed_node3 9733 1726773080.60364: Calling groups_inventory to load vars for managed_node3 9733 1726773080.60365: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773080.60375: Calling all_plugins_play to load vars for managed_node3 9733 1726773080.60377: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773080.60379: Calling groups_plugins_play to load vars for managed_node3 9733 1726773080.60498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773080.60625: done with get_vars() 9733 1726773080.60631: variable 'ansible_search_path' from source: unknown 9733 1726773080.60632: variable 'ansible_search_path' from source: unknown 9733 1726773080.60655: we have included files to process 9733 1726773080.60656: generating all_blocks data 9733 1726773080.60657: done generating all_blocks data 9733 1726773080.60661: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 9733 1726773080.60662: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 9733 1726773080.60664: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml for managed_node3 9733 1726773080.61127: done processing included file 9733 1726773080.61129: iterating over new_blocks loaded from include file 9733 1726773080.61130: in VariableManager get_vars() 9733 1726773080.61148: done with get_vars() 9733 1726773080.61149: filtering new block on tags 9733 1726773080.61169: done filtering new block on tags 9733 1726773080.61195: in VariableManager get_vars() 9733 1726773080.61211: done with get_vars() 9733 1726773080.61213: filtering new block on tags 9733 1726773080.61238: done filtering new block on tags 9733 1726773080.61240: in VariableManager get_vars() 9733 1726773080.61254: done 
with get_vars() 9733 1726773080.61255: filtering new block on tags 9733 1726773080.61280: done filtering new block on tags 9733 1726773080.61282: in VariableManager get_vars() 9733 1726773080.61299: done with get_vars() 9733 1726773080.61300: filtering new block on tags 9733 1726773080.61315: done filtering new block on tags 9733 1726773080.61316: done iterating over new_blocks loaded from include file 9733 1726773080.61317: extending task lists for all hosts with included blocks 9733 1726773080.61446: done extending task lists 9733 1726773080.61447: done processing included files 9733 1726773080.61447: results queue empty 9733 1726773080.61447: checking for any_errors_fatal 9733 1726773080.61450: done checking for any_errors_fatal 9733 1726773080.61451: checking for max_fail_percentage 9733 1726773080.61451: done checking for max_fail_percentage 9733 1726773080.61451: checking to see if all hosts have failed and the running result is not ok 9733 1726773080.61452: done checking to see if all hosts have failed 9733 1726773080.61452: getting the remaining hosts for this loop 9733 1726773080.61453: done getting the remaining hosts for this loop 9733 1726773080.61454: getting the next task for host managed_node3 9733 1726773080.61457: done getting next task for host managed_node3 9733 1726773080.61459: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 9733 1726773080.61461: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 9733 1726773080.61468: getting variables 9733 1726773080.61469: in VariableManager get_vars() 9733 1726773080.61478: Calling all_inventory to load vars for managed_node3 9733 1726773080.61480: Calling groups_inventory to load vars for managed_node3 9733 1726773080.61481: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773080.61484: Calling all_plugins_play to load vars for managed_node3 9733 1726773080.61488: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773080.61490: Calling groups_plugins_play to load vars for managed_node3 9733 1726773080.61564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773080.61674: done with get_vars() 9733 1726773080.61681: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2 Thursday 19 September 2024 15:11:20 -0400 (0:00:00.022) 0:00:26.349 **** 9733 1726773080.61731: entering _queue_task() for managed_node3/setup 9733 1726773080.61905: worker is 1 (out of 1 available) 9733 1726773080.61920: exiting _queue_task() for managed_node3/setup 9733 1726773080.61933: done queuing things up, now waiting for results queue to drain 9733 1726773080.61936: waiting for pending results... 10689 1726773080.62069: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 10689 1726773080.62209: in run() - task 0affffe7-6841-7dd6-8fa6-000000000754 10689 1726773080.62225: variable 'ansible_search_path' from source: unknown 10689 1726773080.62229: variable 'ansible_search_path' from source: unknown 10689 1726773080.62257: calling self._execute() 10689 1726773080.62388: variable 'ansible_host' from source: host vars for 'managed_node3' 10689 1726773080.62397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10689 1726773080.62406: variable 'omit' from source: magic vars 10689 1726773080.62774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10689 1726773080.64331: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10689 1726773080.64394: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10689 1726773080.64424: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10689 1726773080.64452: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10689 1726773080.64472: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10689 1726773080.64530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10689 1726773080.64550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10689 1726773080.64566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 10689 1726773080.64605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10689 1726773080.64618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10689 1726773080.64657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10689 1726773080.64674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10689 1726773080.64697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10689 1726773080.64723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10689 1726773080.64734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10689 1726773080.64857: variable '__kernel_settings_required_facts' from source: role '' all vars 10689 1726773080.64869: variable 'ansible_facts' from source: unknown 10689 1726773080.64934: Evaluated conditional (__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 10689 1726773080.64939: when evaluation is False, skipping this task 10689 1726773080.64943: _execute() done 10689 1726773080.64946: dumping result to json 10689 1726773080.64950: done dumping result, returning 10689 1726773080.64957: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role [0affffe7-6841-7dd6-8fa6-000000000754] 10689 1726773080.64962: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000754 10689 1726773080.64990: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000754 10689 1726773080.64994: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } 9733 1726773080.65199: no more pending results, returning what we have 9733 1726773080.65202: results queue empty 9733 1726773080.65203: checking for any_errors_fatal 9733 1726773080.65204: done checking for any_errors_fatal 9733 1726773080.65205: checking for max_fail_percentage 9733 1726773080.65206: done checking for max_fail_percentage 9733 1726773080.65207: checking to see if all hosts have failed and the running result is not ok 9733 1726773080.65207: done checking to see if all hosts have failed 9733 1726773080.65208: getting the remaining hosts for this loop 9733 1726773080.65209: done getting the remaining hosts for this loop 9733 1726773080.65212: getting the next task for host managed_node3 9733 1726773080.65220: done getting next task for host managed_node3 9733 
1726773080.65223: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 9733 1726773080.65228: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773080.65240: getting variables 9733 1726773080.65242: in VariableManager get_vars() 9733 1726773080.65269: Calling all_inventory to load vars for managed_node3 9733 1726773080.65271: Calling groups_inventory to load vars for managed_node3 9733 1726773080.65273: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773080.65281: Calling all_plugins_play to load vars for managed_node3 9733 1726773080.65283: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773080.65284: Calling groups_plugins_play to load vars for managed_node3 9733 1726773080.65394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773080.65524: done with get_vars() 9733 1726773080.65533: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if system is ostree] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10 Thursday 19 September 2024 15:11:20 -0400 (0:00:00.038) 0:00:26.388 **** 9733 1726773080.65608: entering _queue_task() for managed_node3/stat 9733 1726773080.65779: worker is 1 (out of 1 available) 9733 1726773080.65796: exiting _queue_task() for managed_node3/stat 9733 1726773080.65810: done queuing things up, now waiting for results queue to drain 9733 1726773080.65812: waiting for pending results... 
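The "Ensure ansible_facts used by role" task skipped just above is a setup (fact gathering) action guarded by the condition shown in its skip result: it only gathers facts when some entry of __kernel_settings_required_facts is missing from ansible_facts. A minimal sketch assuming that shape; the gather_subset value is an assumption, while the when expression is taken verbatim from the log:

  - name: Ensure ansible_facts used by role (illustrative sketch)
    ansible.builtin.setup:
      gather_subset: min   # subset value is an assumption
    when: __kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0

Since all required facts were already cached for managed_node3, the condition was False and no extra fact gathering ran.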
10690 1726773080.65945: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 10690 1726773080.66079: in run() - task 0affffe7-6841-7dd6-8fa6-000000000756 10690 1726773080.66098: variable 'ansible_search_path' from source: unknown 10690 1726773080.66102: variable 'ansible_search_path' from source: unknown 10690 1726773080.66129: calling self._execute() 10690 1726773080.66203: variable 'ansible_host' from source: host vars for 'managed_node3' 10690 1726773080.66210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10690 1726773080.66216: variable 'omit' from source: magic vars 10690 1726773080.66570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10690 1726773080.66761: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10690 1726773080.66828: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10690 1726773080.66856: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10690 1726773080.66883: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10690 1726773080.66947: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10690 1726773080.66968: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10690 1726773080.66992: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10690 1726773080.67011: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10690 1726773080.67099: variable '__kernel_settings_is_ostree' from source: set_fact 10690 1726773080.67110: Evaluated conditional (not __kernel_settings_is_ostree is defined): False 10690 1726773080.67115: when evaluation is False, skipping this task 10690 1726773080.67118: _execute() done 10690 1726773080.67121: dumping result to json 10690 1726773080.67125: done dumping result, returning 10690 1726773080.67130: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree [0affffe7-6841-7dd6-8fa6-000000000756] 10690 1726773080.67134: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000756 10690 1726773080.67154: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000756 10690 1726773080.67156: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __kernel_settings_is_ostree is defined", "skip_reason": "Conditional result was False" } 9733 1726773080.67419: no more pending results, returning what we have 9733 1726773080.67422: results queue empty 9733 1726773080.67423: checking for any_errors_fatal 9733 1726773080.67429: done checking for any_errors_fatal 9733 1726773080.67429: checking for max_fail_percentage 9733 1726773080.67430: done checking for max_fail_percentage 9733 1726773080.67431: checking to see if all hosts have failed and the 
running result is not ok 9733 1726773080.67431: done checking to see if all hosts have failed 9733 1726773080.67432: getting the remaining hosts for this loop 9733 1726773080.67433: done getting the remaining hosts for this loop 9733 1726773080.67435: getting the next task for host managed_node3 9733 1726773080.67440: done getting next task for host managed_node3 9733 1726773080.67442: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 9733 1726773080.67445: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773080.67456: getting variables 9733 1726773080.67457: in VariableManager get_vars() 9733 1726773080.67484: Calling all_inventory to load vars for managed_node3 9733 1726773080.67488: Calling groups_inventory to load vars for managed_node3 9733 1726773080.67490: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773080.67497: Calling all_plugins_play to load vars for managed_node3 9733 1726773080.67499: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773080.67500: Calling groups_plugins_play to load vars for managed_node3 9733 1726773080.67610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773080.67763: done with get_vars() 9733 1726773080.67772: done getting variables 9733 1726773080.67815: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:15 Thursday 19 September 2024 15:11:20 -0400 (0:00:00.022) 0:00:26.410 **** 9733 1726773080.67842: entering _queue_task() for managed_node3/set_fact 9733 1726773080.68018: worker is 1 (out of 1 available) 9733 1726773080.68033: exiting _queue_task() for managed_node3/set_fact 9733 1726773080.68046: done queuing things up, now waiting for results queue to drain 9733 1726773080.68048: waiting for pending results... 
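The "Check if system is ostree" stat task above and the "Set flag to indicate system is ostree" set_fact task queued here share the same guard, not __kernel_settings_is_ostree is defined, so both are skipped once the flag has been set earlier in the run. A minimal sketch of that pattern; the stat path and register name are assumptions (the log only shows the task names and the guard):

  - name: Check if system is ostree (illustrative sketch)
    ansible.builtin.stat:
      path: /run/ostree-booted   # path is an assumption
    register: __ostree_booted_stat
    when: not __kernel_settings_is_ostree is defined

  - name: Set flag to indicate system is ostree (illustrative sketch)
    ansible.builtin.set_fact:
      __kernel_settings_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
    when: not __kernel_settings_is_ostree is defined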
10691 1726773080.68182: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 10691 1726773080.68326: in run() - task 0affffe7-6841-7dd6-8fa6-000000000757 10691 1726773080.68342: variable 'ansible_search_path' from source: unknown 10691 1726773080.68346: variable 'ansible_search_path' from source: unknown 10691 1726773080.68375: calling self._execute() 10691 1726773080.68447: variable 'ansible_host' from source: host vars for 'managed_node3' 10691 1726773080.68453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10691 1726773080.68459: variable 'omit' from source: magic vars 10691 1726773080.68793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10691 1726773080.68974: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10691 1726773080.69011: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10691 1726773080.69038: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10691 1726773080.69065: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10691 1726773080.69128: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10691 1726773080.69150: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10691 1726773080.69170: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10691 1726773080.69194: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10691 1726773080.69276: variable '__kernel_settings_is_ostree' from source: set_fact 10691 1726773080.69295: Evaluated conditional (not __kernel_settings_is_ostree is defined): False 10691 1726773080.69300: when evaluation is False, skipping this task 10691 1726773080.69303: _execute() done 10691 1726773080.69307: dumping result to json 10691 1726773080.69311: done dumping result, returning 10691 1726773080.69317: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree [0affffe7-6841-7dd6-8fa6-000000000757] 10691 1726773080.69323: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000757 10691 1726773080.69345: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000757 10691 1726773080.69349: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __kernel_settings_is_ostree is defined", "skip_reason": "Conditional result was False" } 9733 1726773080.69455: no more pending results, returning what we have 9733 1726773080.69458: results queue empty 9733 1726773080.69459: checking for any_errors_fatal 9733 1726773080.69466: done checking for any_errors_fatal 9733 1726773080.69466: checking for max_fail_percentage 9733 1726773080.69468: done checking for max_fail_percentage 9733 1726773080.69468: checking to see if all 
hosts have failed and the running result is not ok 9733 1726773080.69469: done checking to see if all hosts have failed 9733 1726773080.69469: getting the remaining hosts for this loop 9733 1726773080.69471: done getting the remaining hosts for this loop 9733 1726773080.69474: getting the next task for host managed_node3 9733 1726773080.69482: done getting next task for host managed_node3 9733 1726773080.69488: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 9733 1726773080.69492: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773080.69507: getting variables 9733 1726773080.69508: in VariableManager get_vars() 9733 1726773080.69542: Calling all_inventory to load vars for managed_node3 9733 1726773080.69545: Calling groups_inventory to load vars for managed_node3 9733 1726773080.69547: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773080.69555: Calling all_plugins_play to load vars for managed_node3 9733 1726773080.69559: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773080.69561: Calling groups_plugins_play to load vars for managed_node3 9733 1726773080.69670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773080.69794: done with get_vars() 9733 1726773080.69804: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:22 Thursday 19 September 2024 15:11:20 -0400 (0:00:00.020) 0:00:26.431 **** 9733 1726773080.69869: entering _queue_task() for managed_node3/stat 9733 1726773080.70043: worker is 1 (out of 1 available) 9733 1726773080.70057: exiting _queue_task() for managed_node3/stat 9733 1726773080.70069: done queuing things up, now waiting for results queue to drain 9733 1726773080.70071: waiting for pending results... 
10692 1726773080.70204: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 10692 1726773080.70340: in run() - task 0affffe7-6841-7dd6-8fa6-000000000759 10692 1726773080.70357: variable 'ansible_search_path' from source: unknown 10692 1726773080.70362: variable 'ansible_search_path' from source: unknown 10692 1726773080.70392: calling self._execute() 10692 1726773080.70461: variable 'ansible_host' from source: host vars for 'managed_node3' 10692 1726773080.70470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10692 1726773080.70479: variable 'omit' from source: magic vars 10692 1726773080.70812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10692 1726773080.71052: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10692 1726773080.71092: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10692 1726773080.71118: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10692 1726773080.71144: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10692 1726773080.71209: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10692 1726773080.71229: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10692 1726773080.71248: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10692 1726773080.71266: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10692 1726773080.71352: variable '__kernel_settings_is_transactional' from source: set_fact 10692 1726773080.71363: Evaluated conditional (not __kernel_settings_is_transactional is defined): False 10692 1726773080.71368: when evaluation is False, skipping this task 10692 1726773080.71371: _execute() done 10692 1726773080.71374: dumping result to json 10692 1726773080.71378: done dumping result, returning 10692 1726773080.71392: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin [0affffe7-6841-7dd6-8fa6-000000000759] 10692 1726773080.71400: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000759 10692 1726773080.71423: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000759 10692 1726773080.71426: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __kernel_settings_is_transactional is defined", "skip_reason": "Conditional result was False" } 9733 1726773080.71540: no more pending results, returning what we have 9733 1726773080.71543: results queue empty 9733 1726773080.71544: checking for any_errors_fatal 9733 1726773080.71550: done checking for any_errors_fatal 9733 1726773080.71550: checking for max_fail_percentage 9733 1726773080.71552: done checking for max_fail_percentage 9733 
1726773080.71552: checking to see if all hosts have failed and the running result is not ok 9733 1726773080.71553: done checking to see if all hosts have failed 9733 1726773080.71553: getting the remaining hosts for this loop 9733 1726773080.71554: done getting the remaining hosts for this loop 9733 1726773080.71558: getting the next task for host managed_node3 9733 1726773080.71565: done getting next task for host managed_node3 9733 1726773080.71568: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 9733 1726773080.71572: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773080.71589: getting variables 9733 1726773080.71590: in VariableManager get_vars() 9733 1726773080.71624: Calling all_inventory to load vars for managed_node3 9733 1726773080.71627: Calling groups_inventory to load vars for managed_node3 9733 1726773080.71628: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773080.71637: Calling all_plugins_play to load vars for managed_node3 9733 1726773080.71639: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773080.71642: Calling groups_plugins_play to load vars for managed_node3 9733 1726773080.71793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773080.71912: done with get_vars() 9733 1726773080.71922: done getting variables 9733 1726773080.71962: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:27 Thursday 19 September 2024 15:11:20 -0400 (0:00:00.021) 0:00:26.452 **** 9733 1726773080.71989: entering _queue_task() for managed_node3/set_fact 9733 1726773080.72153: worker is 1 (out of 1 available) 9733 1726773080.72168: exiting _queue_task() for managed_node3/set_fact 9733 1726773080.72180: done queuing things up, now waiting for results queue to drain 9733 1726773080.72182: waiting for pending results... 
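The same guarded stat / set_fact pattern repeats for transactional-update detection: "Check if transactional-update exists in /sbin" and "Set flag if transactional-update exists" are both conditioned on not __kernel_settings_is_transactional is defined and are skipped here because the flag was already set. A minimal sketch; the register name is an assumption, and the path follows from the task name:

  - name: Check if transactional-update exists in /sbin (illustrative sketch)
    ansible.builtin.stat:
      path: /sbin/transactional-update
    register: __transactional_update_stat
    when: not __kernel_settings_is_transactional is defined

  - name: Set flag if transactional-update exists (illustrative sketch)
    ansible.builtin.set_fact:
      __kernel_settings_is_transactional: "{{ __transactional_update_stat.stat.exists }}"
    when: not __kernel_settings_is_transactional is defined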
10693 1726773080.72307: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 10693 1726773080.72436: in run() - task 0affffe7-6841-7dd6-8fa6-00000000075a 10693 1726773080.72452: variable 'ansible_search_path' from source: unknown 10693 1726773080.72456: variable 'ansible_search_path' from source: unknown 10693 1726773080.72483: calling self._execute() 10693 1726773080.72555: variable 'ansible_host' from source: host vars for 'managed_node3' 10693 1726773080.72565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10693 1726773080.72574: variable 'omit' from source: magic vars 10693 1726773080.72901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10693 1726773080.73078: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10693 1726773080.73114: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10693 1726773080.73140: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10693 1726773080.73169: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10693 1726773080.73233: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10693 1726773080.73253: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10693 1726773080.73272: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10693 1726773080.73296: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10693 1726773080.73377: variable '__kernel_settings_is_transactional' from source: set_fact 10693 1726773080.73391: Evaluated conditional (not __kernel_settings_is_transactional is defined): False 10693 1726773080.73396: when evaluation is False, skipping this task 10693 1726773080.73399: _execute() done 10693 1726773080.73401: dumping result to json 10693 1726773080.73403: done dumping result, returning 10693 1726773080.73407: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists [0affffe7-6841-7dd6-8fa6-00000000075a] 10693 1726773080.73411: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000075a 10693 1726773080.73430: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000075a 10693 1726773080.73432: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __kernel_settings_is_transactional is defined", "skip_reason": "Conditional result was False" } 9733 1726773080.73583: no more pending results, returning what we have 9733 1726773080.73589: results queue empty 9733 1726773080.73590: checking for any_errors_fatal 9733 1726773080.73596: done checking for any_errors_fatal 9733 1726773080.73597: checking for max_fail_percentage 9733 1726773080.73598: done checking for max_fail_percentage 9733 1726773080.73599: 
checking to see if all hosts have failed and the running result is not ok 9733 1726773080.73599: done checking to see if all hosts have failed 9733 1726773080.73600: getting the remaining hosts for this loop 9733 1726773080.73601: done getting the remaining hosts for this loop 9733 1726773080.73604: getting the next task for host managed_node3 9733 1726773080.73614: done getting next task for host managed_node3 9733 1726773080.73618: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 9733 1726773080.73622: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773080.73636: getting variables 9733 1726773080.73638: in VariableManager get_vars() 9733 1726773080.73670: Calling all_inventory to load vars for managed_node3 9733 1726773080.73673: Calling groups_inventory to load vars for managed_node3 9733 1726773080.73674: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773080.73683: Calling all_plugins_play to load vars for managed_node3 9733 1726773080.73690: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773080.73692: Calling groups_plugins_play to load vars for managed_node3 9733 1726773080.73796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773080.73922: done with get_vars() 9733 1726773080.73929: done getting variables 9733 1726773080.73970: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set platform/version specific variables] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:31 Thursday 19 September 2024 15:11:20 -0400 (0:00:00.020) 0:00:26.472 **** 9733 1726773080.73998: entering _queue_task() for managed_node3/include_vars 9733 1726773080.74157: worker is 1 (out of 1 available) 9733 1726773080.74172: exiting _queue_task() for managed_node3/include_vars 9733 1726773080.74184: done queuing things up, now waiting for results queue to drain 9733 1726773080.74190: waiting for pending results... 
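The "Set platform/version specific variables" task queued here is an include_vars action that resolves its file through a first_found lookup: the trace that follows shows ffparams coming from task vars and role_path from magic vars, and the result shows vars/default.yml being loaded, defining __kernel_settings_packages and __kernel_settings_services. A minimal sketch of that mechanism; every candidate file name other than default.yml is an assumption:

  - name: Set platform/version specific variables (illustrative sketch)
    ansible.builtin.include_vars: "{{ lookup('first_found', ffparams) }}"
    vars:
      ffparams:
        files:
          # distribution/version specific names are assumptions
          - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
          - "{{ ansible_facts['distribution'] }}.yml"
          - default.yml
        paths:
          - "{{ role_path }}/vars"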
10694 1726773080.74315: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 10694 1726773080.74447: in run() - task 0affffe7-6841-7dd6-8fa6-00000000075c 10694 1726773080.74463: variable 'ansible_search_path' from source: unknown 10694 1726773080.74467: variable 'ansible_search_path' from source: unknown 10694 1726773080.74496: calling self._execute() 10694 1726773080.74563: variable 'ansible_host' from source: host vars for 'managed_node3' 10694 1726773080.74572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10694 1726773080.74581: variable 'omit' from source: magic vars 10694 1726773080.74652: variable 'omit' from source: magic vars 10694 1726773080.74703: variable 'omit' from source: magic vars 10694 1726773080.74960: variable 'ffparams' from source: task vars 10694 1726773080.75113: variable 'ansible_facts' from source: unknown 10694 1726773080.75237: variable 'ansible_facts' from source: unknown 10694 1726773080.75324: variable 'ansible_facts' from source: unknown 10694 1726773080.75411: variable 'ansible_facts' from source: unknown 10694 1726773080.75487: variable 'role_path' from source: magic vars 10694 1726773080.75613: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 10694 1726773080.75751: Loaded config def from plugin (lookup/first_found) 10694 1726773080.75759: Loading LookupModule 'first_found' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/first_found.py 10694 1726773080.75790: variable 'ansible_search_path' from source: unknown 10694 1726773080.75809: variable 'ansible_search_path' from source: unknown 10694 1726773080.75818: variable 'ansible_search_path' from source: unknown 10694 1726773080.75825: variable 'ansible_search_path' from source: unknown 10694 1726773080.75833: variable 'ansible_search_path' from source: unknown 10694 1726773080.75848: variable 'omit' from source: magic vars 10694 1726773080.75866: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10694 1726773080.75886: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10694 1726773080.75904: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10694 1726773080.75918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10694 1726773080.75927: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10694 1726773080.75949: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10694 1726773080.75954: variable 'ansible_host' from source: host vars for 'managed_node3' 10694 1726773080.75959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10694 1726773080.76024: Set connection var ansible_timeout to 10 10694 1726773080.76030: Set connection var ansible_shell_type to sh 10694 1726773080.76036: Set connection var ansible_module_compression to ZIP_DEFLATED 10694 1726773080.76041: Set connection var ansible_shell_executable to /bin/sh 10694 1726773080.76046: Set connection var ansible_pipelining to False 10694 1726773080.76051: Set connection var ansible_connection to ssh 10694 1726773080.76064: variable 'ansible_shell_executable' from source: unknown 10694 1726773080.76067: variable 'ansible_connection' from source: unknown 
10694 1726773080.76069: variable 'ansible_module_compression' from source: unknown 10694 1726773080.76070: variable 'ansible_shell_type' from source: unknown 10694 1726773080.76072: variable 'ansible_shell_executable' from source: unknown 10694 1726773080.76074: variable 'ansible_host' from source: host vars for 'managed_node3' 10694 1726773080.76076: variable 'ansible_pipelining' from source: unknown 10694 1726773080.76077: variable 'ansible_timeout' from source: unknown 10694 1726773080.76079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10694 1726773080.76161: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10694 1726773080.76172: variable 'omit' from source: magic vars 10694 1726773080.76178: starting attempt loop 10694 1726773080.76182: running the handler 10694 1726773080.76227: handler run complete 10694 1726773080.76237: attempt loop complete, returning result 10694 1726773080.76240: _execute() done 10694 1726773080.76244: dumping result to json 10694 1726773080.76248: done dumping result, returning 10694 1726773080.76255: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables [0affffe7-6841-7dd6-8fa6-00000000075c] 10694 1726773080.76261: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000075c 10694 1726773080.76283: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000075c 10694 1726773080.76288: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_packages": [ "tuned", "python3-configobj" ], "__kernel_settings_services": [ "tuned" ] }, "ansible_included_var_files": [ "/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/default.yml" ], "changed": false } 9733 1726773080.76440: no more pending results, returning what we have 9733 1726773080.76443: results queue empty 9733 1726773080.76444: checking for any_errors_fatal 9733 1726773080.76448: done checking for any_errors_fatal 9733 1726773080.76449: checking for max_fail_percentage 9733 1726773080.76450: done checking for max_fail_percentage 9733 1726773080.76451: checking to see if all hosts have failed and the running result is not ok 9733 1726773080.76451: done checking to see if all hosts have failed 9733 1726773080.76452: getting the remaining hosts for this loop 9733 1726773080.76453: done getting the remaining hosts for this loop 9733 1726773080.76456: getting the next task for host managed_node3 9733 1726773080.76463: done getting next task for host managed_node3 9733 1726773080.76467: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 9733 1726773080.76470: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773080.76482: getting variables 9733 1726773080.76483: in VariableManager get_vars() 9733 1726773080.76520: Calling all_inventory to load vars for managed_node3 9733 1726773080.76523: Calling groups_inventory to load vars for managed_node3 9733 1726773080.76525: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773080.76538: Calling all_plugins_play to load vars for managed_node3 9733 1726773080.76540: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773080.76542: Calling groups_plugins_play to load vars for managed_node3 9733 1726773080.76693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773080.76815: done with get_vars() 9733 1726773080.76824: done getting variables 9733 1726773080.76865: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required packages are installed] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 Thursday 19 September 2024 15:11:20 -0400 (0:00:00.028) 0:00:26.501 **** 9733 1726773080.76892: entering _queue_task() for managed_node3/package 9733 1726773080.77067: worker is 1 (out of 1 available) 9733 1726773080.77084: exiting _queue_task() for managed_node3/package 9733 1726773080.77101: done queuing things up, now waiting for results queue to drain 9733 1726773080.77103: waiting for pending results... 
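The "Ensure required packages are installed" task queued here installs the __kernel_settings_packages list set by the preceding include_vars. The module invocation later in the log confirms it is dispatched to the dnf backend with name ["tuned", "python3-configobj"] and state present, and it finishes with "Nothing to do" because both packages are already installed. A minimal sketch of the task:

  - name: Ensure required packages are installed (illustrative sketch)
    ansible.builtin.package:
      name: "{{ __kernel_settings_packages }}"
      state: present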
10695 1726773080.77229: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 10695 1726773080.77355: in run() - task 0affffe7-6841-7dd6-8fa6-0000000006da 10695 1726773080.77372: variable 'ansible_search_path' from source: unknown 10695 1726773080.77376: variable 'ansible_search_path' from source: unknown 10695 1726773080.77406: calling self._execute() 10695 1726773080.77476: variable 'ansible_host' from source: host vars for 'managed_node3' 10695 1726773080.77486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10695 1726773080.77495: variable 'omit' from source: magic vars 10695 1726773080.77573: variable 'omit' from source: magic vars 10695 1726773080.77615: variable 'omit' from source: magic vars 10695 1726773080.77638: variable '__kernel_settings_packages' from source: include_vars 10695 1726773080.77855: variable '__kernel_settings_packages' from source: include_vars 10695 1726773080.78014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10695 1726773080.79514: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10695 1726773080.79569: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10695 1726773080.79601: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10695 1726773080.79627: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10695 1726773080.79648: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10695 1726773080.79719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10695 1726773080.79740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10695 1726773080.79758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10695 1726773080.79786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10695 1726773080.79801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10695 1726773080.79872: variable '__kernel_settings_is_ostree' from source: set_fact 10695 1726773080.79879: variable 'omit' from source: magic vars 10695 1726773080.79904: variable 'omit' from source: magic vars 10695 1726773080.79925: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10695 1726773080.79942: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10695 1726773080.79954: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10695 1726773080.79966: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10695 1726773080.79973: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10695 1726773080.79998: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10695 1726773080.80004: variable 'ansible_host' from source: host vars for 'managed_node3' 10695 1726773080.80008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10695 1726773080.80075: Set connection var ansible_timeout to 10 10695 1726773080.80080: Set connection var ansible_shell_type to sh 10695 1726773080.80088: Set connection var ansible_module_compression to ZIP_DEFLATED 10695 1726773080.80094: Set connection var ansible_shell_executable to /bin/sh 10695 1726773080.80100: Set connection var ansible_pipelining to False 10695 1726773080.80106: Set connection var ansible_connection to ssh 10695 1726773080.80123: variable 'ansible_shell_executable' from source: unknown 10695 1726773080.80128: variable 'ansible_connection' from source: unknown 10695 1726773080.80130: variable 'ansible_module_compression' from source: unknown 10695 1726773080.80132: variable 'ansible_shell_type' from source: unknown 10695 1726773080.80134: variable 'ansible_shell_executable' from source: unknown 10695 1726773080.80135: variable 'ansible_host' from source: host vars for 'managed_node3' 10695 1726773080.80137: variable 'ansible_pipelining' from source: unknown 10695 1726773080.80139: variable 'ansible_timeout' from source: unknown 10695 1726773080.80141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10695 1726773080.80208: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10695 1726773080.80220: variable 'omit' from source: magic vars 10695 1726773080.80227: starting attempt loop 10695 1726773080.80230: running the handler 10695 1726773080.80292: variable 'ansible_facts' from source: unknown 10695 1726773080.80373: _low_level_execute_command(): starting 10695 1726773080.80382: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10695 1726773080.82778: stdout chunk (state=2): >>>/root <<< 10695 1726773080.82900: stderr chunk (state=3): >>><<< 10695 1726773080.82908: stdout chunk (state=3): >>><<< 10695 1726773080.82931: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10695 1726773080.82944: _low_level_execute_command(): starting 10695 1726773080.82950: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773080.8293977-10695-106784395046480 `" && echo ansible-tmp-1726773080.8293977-10695-106784395046480="` echo /root/.ansible/tmp/ansible-tmp-1726773080.8293977-10695-106784395046480 `" ) && sleep 0' 10695 1726773080.85531: stdout chunk (state=2): >>>ansible-tmp-1726773080.8293977-10695-106784395046480=/root/.ansible/tmp/ansible-tmp-1726773080.8293977-10695-106784395046480 <<< 10695 1726773080.85670: stderr chunk (state=3): >>><<< 10695 1726773080.85677: stdout chunk (state=3): >>><<< 10695 1726773080.85695: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726773080.8293977-10695-106784395046480=/root/.ansible/tmp/ansible-tmp-1726773080.8293977-10695-106784395046480 , stderr= 10695 1726773080.85723: variable 'ansible_module_compression' from source: unknown 10695 1726773080.85769: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 10695 1726773080.85806: variable 'ansible_facts' from source: unknown 10695 1726773080.85898: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773080.8293977-10695-106784395046480/AnsiballZ_dnf.py 10695 1726773080.86006: Sending initial data 10695 1726773080.86013: Sent initial data (151 bytes) 10695 1726773080.88629: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpa57dw4e6 /root/.ansible/tmp/ansible-tmp-1726773080.8293977-10695-106784395046480/AnsiballZ_dnf.py <<< 10695 1726773080.90386: stderr chunk (state=3): >>><<< 10695 1726773080.90397: stdout chunk (state=3): >>><<< 10695 1726773080.90417: done transferring module to remote 10695 1726773080.90429: _low_level_execute_command(): starting 10695 1726773080.90434: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773080.8293977-10695-106784395046480/ /root/.ansible/tmp/ansible-tmp-1726773080.8293977-10695-106784395046480/AnsiballZ_dnf.py && sleep 0' 10695 1726773080.92929: stderr chunk (state=2): >>><<< 10695 1726773080.92942: stdout chunk (state=2): >>><<< 10695 1726773080.92961: _low_level_execute_command() done: rc=0, stdout=, stderr= 10695 1726773080.92967: _low_level_execute_command(): starting 10695 1726773080.92973: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773080.8293977-10695-106784395046480/AnsiballZ_dnf.py && sleep 0' 10695 1726773083.49277: stdout chunk (state=2): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 10695 1726773083.57324: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. 
<<< 10695 1726773083.57374: stderr chunk (state=3): >>><<< 10695 1726773083.57381: stdout chunk (state=3): >>><<< 10695 1726773083.57400: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.47.99 closed. 10695 1726773083.57435: done with _execute_module (ansible.legacy.dnf, {'name': ['tuned', 'python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773080.8293977-10695-106784395046480/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10695 1726773083.57444: _low_level_execute_command(): starting 10695 1726773083.57450: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773080.8293977-10695-106784395046480/ > /dev/null 2>&1 && sleep 0' 10695 1726773083.59921: stderr chunk (state=2): >>><<< 10695 1726773083.59933: stdout chunk (state=2): >>><<< 10695 1726773083.59946: _low_level_execute_command() done: rc=0, stdout=, stderr= 10695 1726773083.59952: handler run complete 10695 1726773083.59976: attempt loop complete, returning result 10695 1726773083.59979: _execute() done 10695 1726773083.59981: dumping result to json 10695 1726773083.59986: done dumping result, returning 10695 1726773083.59995: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed [0affffe7-6841-7dd6-8fa6-0000000006da] 10695 1726773083.60004: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006da 10695 1726773083.60035: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006da 10695 1726773083.60038: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 9733 1726773083.60204: no more pending results, returning what we have 9733 1726773083.60207: results queue empty 9733 1726773083.60207: checking for any_errors_fatal 9733 1726773083.60216: done checking for any_errors_fatal 9733 1726773083.60217: checking for max_fail_percentage 9733 1726773083.60218: done checking for max_fail_percentage 9733 1726773083.60219: checking to see if all hosts have failed and the running result is not ok 9733 1726773083.60220: done checking to see if all hosts have failed 9733 1726773083.60220: getting the remaining hosts for this loop 9733 
1726773083.60221: done getting the remaining hosts for this loop 9733 1726773083.60224: getting the next task for host managed_node3 9733 1726773083.60233: done getting next task for host managed_node3 9733 1726773083.60236: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 9733 1726773083.60239: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773083.60249: getting variables 9733 1726773083.60250: in VariableManager get_vars() 9733 1726773083.60286: Calling all_inventory to load vars for managed_node3 9733 1726773083.60289: Calling groups_inventory to load vars for managed_node3 9733 1726773083.60291: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773083.60300: Calling all_plugins_play to load vars for managed_node3 9733 1726773083.60303: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773083.60306: Calling groups_plugins_play to load vars for managed_node3 9733 1726773083.60422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773083.60547: done with get_vars() 9733 1726773083.60557: done getting variables 9733 1726773083.60604: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:24 Thursday 19 September 2024 15:11:23 -0400 (0:00:02.837) 0:00:29.338 **** 9733 1726773083.60631: entering _queue_task() for managed_node3/debug 9733 1726773083.60806: worker is 1 (out of 1 available) 9733 1726773083.60821: exiting _queue_task() for managed_node3/debug 9733 1726773083.60833: done queuing things up, now waiting for results queue to drain 9733 1726773083.60835: waiting for pending results... 
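The debug task queued here (tasks/main.yml:24) is guarded by the transactional-update flag; the when expression in the sketch below is taken from the false_condition shown in the skip result that follows, while the message text is only illustrative.

    - name: Notify user that reboot is needed to apply changes
      debug:
        msg: Reboot is required to apply kernel settings changes   # illustrative wording, not from the log
      when: __kernel_settings_is_transactional | d(false)

On this host __kernel_settings_is_transactional is unset or false, so d(false), the short form of the default filter, leaves the condition False and the task is skipped.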
10752 1726773083.60975: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 10752 1726773083.61105: in run() - task 0affffe7-6841-7dd6-8fa6-0000000006dc 10752 1726773083.61122: variable 'ansible_search_path' from source: unknown 10752 1726773083.61127: variable 'ansible_search_path' from source: unknown 10752 1726773083.61155: calling self._execute() 10752 1726773083.61235: variable 'ansible_host' from source: host vars for 'managed_node3' 10752 1726773083.61245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10752 1726773083.61253: variable 'omit' from source: magic vars 10752 1726773083.61604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10752 1726773083.63176: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10752 1726773083.63228: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10752 1726773083.63257: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10752 1726773083.63287: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10752 1726773083.63310: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10752 1726773083.63366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10752 1726773083.63399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10752 1726773083.63421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10752 1726773083.63449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10752 1726773083.63461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10752 1726773083.63541: variable '__kernel_settings_is_transactional' from source: set_fact 10752 1726773083.63556: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 10752 1726773083.63559: when evaluation is False, skipping this task 10752 1726773083.63561: _execute() done 10752 1726773083.63563: dumping result to json 10752 1726773083.63565: done dumping result, returning 10752 1726773083.63569: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes [0affffe7-6841-7dd6-8fa6-0000000006dc] 10752 1726773083.63574: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006dc 10752 1726773083.63595: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006dc 10752 1726773083.63597: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "__kernel_settings_is_transactional | 
d(false)" } 9733 1726773083.63808: no more pending results, returning what we have 9733 1726773083.63811: results queue empty 9733 1726773083.63811: checking for any_errors_fatal 9733 1726773083.63817: done checking for any_errors_fatal 9733 1726773083.63818: checking for max_fail_percentage 9733 1726773083.63819: done checking for max_fail_percentage 9733 1726773083.63819: checking to see if all hosts have failed and the running result is not ok 9733 1726773083.63820: done checking to see if all hosts have failed 9733 1726773083.63820: getting the remaining hosts for this loop 9733 1726773083.63821: done getting the remaining hosts for this loop 9733 1726773083.63823: getting the next task for host managed_node3 9733 1726773083.63828: done getting next task for host managed_node3 9733 1726773083.63830: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 9733 1726773083.63833: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773083.63844: getting variables 9733 1726773083.63845: in VariableManager get_vars() 9733 1726773083.63871: Calling all_inventory to load vars for managed_node3 9733 1726773083.63873: Calling groups_inventory to load vars for managed_node3 9733 1726773083.63874: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773083.63884: Calling all_plugins_play to load vars for managed_node3 9733 1726773083.63886: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773083.63889: Calling groups_plugins_play to load vars for managed_node3 9733 1726773083.63998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773083.64165: done with get_vars() 9733 1726773083.64172: done getting variables 9733 1726773083.64216: Loading ActionModule 'reboot' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Reboot transactional update systems] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:29 Thursday 19 September 2024 15:11:23 -0400 (0:00:00.036) 0:00:29.374 **** 9733 1726773083.64239: entering _queue_task() for managed_node3/reboot 9733 1726773083.64414: worker is 1 (out of 1 available) 9733 1726773083.64430: exiting _queue_task() for managed_node3/reboot 9733 1726773083.64442: done queuing things up, now waiting for results queue to drain 9733 1726773083.64444: waiting for pending results... 
10753 1726773083.64580: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 10753 1726773083.64713: in run() - task 0affffe7-6841-7dd6-8fa6-0000000006dd 10753 1726773083.64730: variable 'ansible_search_path' from source: unknown 10753 1726773083.64734: variable 'ansible_search_path' from source: unknown 10753 1726773083.64763: calling self._execute() 10753 1726773083.64840: variable 'ansible_host' from source: host vars for 'managed_node3' 10753 1726773083.64849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10753 1726773083.64858: variable 'omit' from source: magic vars 10753 1726773083.65214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10753 1726773083.66750: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10753 1726773083.66811: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10753 1726773083.66841: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10753 1726773083.66868: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10753 1726773083.66890: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10753 1726773083.66946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10753 1726773083.66964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10753 1726773083.66979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10753 1726773083.67010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10753 1726773083.67020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10753 1726773083.67096: variable '__kernel_settings_is_transactional' from source: set_fact 10753 1726773083.67113: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 10753 1726773083.67117: when evaluation is False, skipping this task 10753 1726773083.67119: _execute() done 10753 1726773083.67121: dumping result to json 10753 1726773083.67123: done dumping result, returning 10753 1726773083.67127: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems [0affffe7-6841-7dd6-8fa6-0000000006dd] 10753 1726773083.67131: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006dd 10753 1726773083.67151: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006dd 10753 1726773083.67154: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", 
"skip_reason": "Conditional result was False" } 9733 1726773083.67441: no more pending results, returning what we have 9733 1726773083.67444: results queue empty 9733 1726773083.67444: checking for any_errors_fatal 9733 1726773083.67449: done checking for any_errors_fatal 9733 1726773083.67449: checking for max_fail_percentage 9733 1726773083.67450: done checking for max_fail_percentage 9733 1726773083.67451: checking to see if all hosts have failed and the running result is not ok 9733 1726773083.67451: done checking to see if all hosts have failed 9733 1726773083.67452: getting the remaining hosts for this loop 9733 1726773083.67452: done getting the remaining hosts for this loop 9733 1726773083.67455: getting the next task for host managed_node3 9733 1726773083.67459: done getting next task for host managed_node3 9733 1726773083.67462: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 9733 1726773083.67464: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 9733 1726773083.67475: getting variables 9733 1726773083.67476: in VariableManager get_vars() 9733 1726773083.67506: Calling all_inventory to load vars for managed_node3 9733 1726773083.67508: Calling groups_inventory to load vars for managed_node3 9733 1726773083.67509: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773083.67516: Calling all_plugins_play to load vars for managed_node3 9733 1726773083.67518: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773083.67519: Calling groups_plugins_play to load vars for managed_node3 9733 1726773083.67633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773083.67756: done with get_vars() 9733 1726773083.67765: done getting variables 9733 1726773083.67810: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:34 Thursday 19 September 2024 15:11:23 -0400 (0:00:00.035) 0:00:29.410 **** 9733 1726773083.67834: entering _queue_task() for managed_node3/fail 9733 1726773083.68008: worker is 1 (out of 1 available) 9733 1726773083.68024: exiting _queue_task() for managed_node3/fail 9733 1726773083.68037: done queuing things up, now waiting for results queue to drain 9733 1726773083.68039: waiting for pending results... 10754 1726773083.68178: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 10754 1726773083.68313: in run() - task 0affffe7-6841-7dd6-8fa6-0000000006de 10754 1726773083.68331: variable 'ansible_search_path' from source: unknown 10754 1726773083.68335: variable 'ansible_search_path' from source: unknown 10754 1726773083.68364: calling self._execute() 10754 1726773083.68437: variable 'ansible_host' from source: host vars for 'managed_node3' 10754 1726773083.68445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10754 1726773083.68452: variable 'omit' from source: magic vars 10754 1726773083.68798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10754 1726773083.70358: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10754 1726773083.70411: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10754 1726773083.70440: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10754 1726773083.70469: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10754 1726773083.70491: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10754 1726773083.70549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10754 1726773083.70802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10754 1726773083.70823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10754 1726773083.70851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10754 1726773083.70862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10754 1726773083.70943: variable '__kernel_settings_is_transactional' from source: set_fact 10754 1726773083.70959: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 10754 1726773083.70964: when evaluation is False, skipping this task 10754 1726773083.70968: _execute() done 10754 1726773083.70971: dumping result to json 10754 1726773083.70975: done dumping result, returning 10754 1726773083.70982: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set [0affffe7-6841-7dd6-8fa6-0000000006de] 10754 1726773083.70989: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006de 10754 1726773083.71015: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006de 10754 1726773083.71017: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", "skip_reason": "Conditional result was False" } 9733 1726773083.71236: no more pending results, returning what we have 9733 1726773083.71238: results queue empty 9733 1726773083.71239: checking for any_errors_fatal 9733 1726773083.71245: done checking for any_errors_fatal 9733 1726773083.71246: checking for max_fail_percentage 9733 1726773083.71247: done checking for max_fail_percentage 9733 1726773083.71248: checking to see if all hosts have failed and the running result is not ok 9733 1726773083.71248: done checking to see if all hosts have failed 9733 1726773083.71249: getting the remaining hosts for this loop 9733 1726773083.71250: done getting the remaining hosts for this loop 9733 1726773083.71253: getting the next task for host managed_node3 9733 1726773083.71261: done getting next task for host managed_node3 9733 1726773083.71264: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 9733 1726773083.71268: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
False, did start at task? False 9733 1726773083.71281: getting variables 9733 1726773083.71283: in VariableManager get_vars() 9733 1726773083.71316: Calling all_inventory to load vars for managed_node3 9733 1726773083.71318: Calling groups_inventory to load vars for managed_node3 9733 1726773083.71320: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773083.71327: Calling all_plugins_play to load vars for managed_node3 9733 1726773083.71329: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773083.71330: Calling groups_plugins_play to load vars for managed_node3 9733 1726773083.71481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773083.71603: done with get_vars() 9733 1726773083.71611: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Read tuned main config] ****** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42 Thursday 19 September 2024 15:11:23 -0400 (0:00:00.038) 0:00:29.449 **** 9733 1726773083.71671: entering _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config 9733 1726773083.71846: worker is 1 (out of 1 available) 9733 1726773083.71861: exiting _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config 9733 1726773083.71873: done queuing things up, now waiting for results queue to drain 9733 1726773083.71876: waiting for pending results... 10755 1726773083.72019: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 10755 1726773083.72143: in run() - task 0affffe7-6841-7dd6-8fa6-0000000006e0 10755 1726773083.72161: variable 'ansible_search_path' from source: unknown 10755 1726773083.72165: variable 'ansible_search_path' from source: unknown 10755 1726773083.72196: calling self._execute() 10755 1726773083.72267: variable 'ansible_host' from source: host vars for 'managed_node3' 10755 1726773083.72276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10755 1726773083.72286: variable 'omit' from source: magic vars 10755 1726773083.72360: variable 'omit' from source: magic vars 10755 1726773083.72405: variable 'omit' from source: magic vars 10755 1726773083.72427: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 10755 1726773083.72645: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 10755 1726773083.72708: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10755 1726773083.72736: variable 'omit' from source: magic vars 10755 1726773083.72768: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10755 1726773083.72797: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10755 1726773083.72816: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10755 1726773083.72830: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10755 1726773083.72841: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10755 1726773083.72865: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10755 1726773083.72870: variable 'ansible_host' from source: host vars for 
'managed_node3' 10755 1726773083.72874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10755 1726773083.72948: Set connection var ansible_timeout to 10 10755 1726773083.72953: Set connection var ansible_shell_type to sh 10755 1726773083.72959: Set connection var ansible_module_compression to ZIP_DEFLATED 10755 1726773083.72964: Set connection var ansible_shell_executable to /bin/sh 10755 1726773083.72970: Set connection var ansible_pipelining to False 10755 1726773083.72977: Set connection var ansible_connection to ssh 10755 1726773083.72995: variable 'ansible_shell_executable' from source: unknown 10755 1726773083.73002: variable 'ansible_connection' from source: unknown 10755 1726773083.73006: variable 'ansible_module_compression' from source: unknown 10755 1726773083.73010: variable 'ansible_shell_type' from source: unknown 10755 1726773083.73013: variable 'ansible_shell_executable' from source: unknown 10755 1726773083.73017: variable 'ansible_host' from source: host vars for 'managed_node3' 10755 1726773083.73021: variable 'ansible_pipelining' from source: unknown 10755 1726773083.73024: variable 'ansible_timeout' from source: unknown 10755 1726773083.73028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10755 1726773083.73156: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10755 1726773083.73168: variable 'omit' from source: magic vars 10755 1726773083.73174: starting attempt loop 10755 1726773083.73178: running the handler 10755 1726773083.73191: _low_level_execute_command(): starting 10755 1726773083.73199: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10755 1726773083.75575: stdout chunk (state=2): >>>/root <<< 10755 1726773083.75699: stderr chunk (state=3): >>><<< 10755 1726773083.75707: stdout chunk (state=3): >>><<< 10755 1726773083.75724: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10755 1726773083.75737: _low_level_execute_command(): starting 10755 1726773083.75743: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773083.7573175-10755-59263241944398 `" && echo ansible-tmp-1726773083.7573175-10755-59263241944398="` echo /root/.ansible/tmp/ansible-tmp-1726773083.7573175-10755-59263241944398 `" ) && sleep 0' 10755 1726773083.78226: stdout chunk (state=2): >>>ansible-tmp-1726773083.7573175-10755-59263241944398=/root/.ansible/tmp/ansible-tmp-1726773083.7573175-10755-59263241944398 <<< 10755 1726773083.78354: stderr chunk (state=3): >>><<< 10755 1726773083.78360: stdout chunk (state=3): >>><<< 10755 1726773083.78374: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773083.7573175-10755-59263241944398=/root/.ansible/tmp/ansible-tmp-1726773083.7573175-10755-59263241944398 , stderr= 10755 1726773083.78412: variable 'ansible_module_compression' from source: unknown 10755 1726773083.78446: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 10755 1726773083.78475: variable 'ansible_facts' from source: unknown 10755 1726773083.78543: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726773083.7573175-10755-59263241944398/AnsiballZ_kernel_settings_get_config.py 10755 1726773083.78643: Sending initial data 10755 1726773083.78650: Sent initial data (173 bytes) 10755 1726773083.81225: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpr8y5pdus /root/.ansible/tmp/ansible-tmp-1726773083.7573175-10755-59263241944398/AnsiballZ_kernel_settings_get_config.py <<< 10755 1726773083.82380: stderr chunk (state=3): >>><<< 10755 1726773083.82391: stdout chunk (state=3): >>><<< 10755 1726773083.82413: done transferring module to remote 10755 1726773083.82424: _low_level_execute_command(): starting 10755 1726773083.82429: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773083.7573175-10755-59263241944398/ /root/.ansible/tmp/ansible-tmp-1726773083.7573175-10755-59263241944398/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10755 1726773083.84894: stderr chunk (state=2): >>><<< 10755 1726773083.84906: stdout chunk (state=2): >>><<< 10755 1726773083.84923: _low_level_execute_command() done: rc=0, stdout=, stderr= 10755 1726773083.84927: _low_level_execute_command(): starting 10755 1726773083.84933: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773083.7573175-10755-59263241944398/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10755 1726773084.00584: stdout chunk (state=2): >>> {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} <<< 10755 1726773084.01632: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10755 1726773084.01681: stderr chunk (state=3): >>><<< 10755 1726773084.01689: stdout chunk (state=3): >>><<< 10755 1726773084.01708: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} , stderr=Shared connection to 10.31.47.99 closed. 
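The module executed above is the role's own fedora.linux_system_roles.kernel_settings_get_config, pointed at /etc/tuned/tuned-main.conf through the __kernel_settings_tuned_main_conf_file variable seen earlier in this task. A minimal sketch, assuming the result is registered under the __kernel_settings_register_tuned_main name that later tasks read:

    - name: Read tuned main config
      fedora.linux_system_roles.kernel_settings_get_config:
        path: "{{ __kernel_settings_tuned_main_conf_file }}"   # /etc/tuned/tuned-main.conf in this run
      register: __kernel_settings_register_tuned_main

The returned data block (daemon, dynamic_tuning, sleep_interval, and so on) is the parsed key/value content of tuned-main.conf; the next task consults it to work out where tuned keeps its profile directories.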
10755 1726773084.01737: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/tuned-main.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773083.7573175-10755-59263241944398/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10755 1726773084.01748: _low_level_execute_command(): starting 10755 1726773084.01754: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773083.7573175-10755-59263241944398/ > /dev/null 2>&1 && sleep 0' 10755 1726773084.04206: stderr chunk (state=2): >>><<< 10755 1726773084.04216: stdout chunk (state=2): >>><<< 10755 1726773084.04233: _low_level_execute_command() done: rc=0, stdout=, stderr= 10755 1726773084.04241: handler run complete 10755 1726773084.04256: attempt loop complete, returning result 10755 1726773084.04259: _execute() done 10755 1726773084.04263: dumping result to json 10755 1726773084.04267: done dumping result, returning 10755 1726773084.04275: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config [0affffe7-6841-7dd6-8fa6-0000000006e0] 10755 1726773084.04281: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006e0 10755 1726773084.04315: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006e0 10755 1726773084.04319: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "data": { "daemon": "1", "default_instance_priority": "0", "dynamic_tuning": "0", "log_file_count": "2", "log_file_max_size": "1MB", "reapply_sysctl": "1", "recommend_command": "1", "sleep_interval": "1", "udev_buffer_size": "1MB", "update_interval": "10" } } 9733 1726773084.04463: no more pending results, returning what we have 9733 1726773084.04467: results queue empty 9733 1726773084.04467: checking for any_errors_fatal 9733 1726773084.04473: done checking for any_errors_fatal 9733 1726773084.04474: checking for max_fail_percentage 9733 1726773084.04475: done checking for max_fail_percentage 9733 1726773084.04476: checking to see if all hosts have failed and the running result is not ok 9733 1726773084.04476: done checking to see if all hosts have failed 9733 1726773084.04477: getting the remaining hosts for this loop 9733 1726773084.04478: done getting the remaining hosts for this loop 9733 1726773084.04481: getting the next task for host managed_node3 9733 1726773084.04489: done getting next task for host managed_node3 9733 1726773084.04492: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 9733 1726773084.04496: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773084.04507: getting variables 9733 1726773084.04509: in VariableManager get_vars() 9733 1726773084.04543: Calling all_inventory to load vars for managed_node3 9733 1726773084.04546: Calling groups_inventory to load vars for managed_node3 9733 1726773084.04547: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773084.04557: Calling all_plugins_play to load vars for managed_node3 9733 1726773084.04559: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773084.04561: Calling groups_plugins_play to load vars for managed_node3 9733 1726773084.04676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773084.04803: done with get_vars() 9733 1726773084.04812: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 Thursday 19 September 2024 15:11:24 -0400 (0:00:00.332) 0:00:29.781 **** 9733 1726773084.04877: entering _queue_task() for managed_node3/stat 9733 1726773084.05052: worker is 1 (out of 1 available) 9733 1726773084.05068: exiting _queue_task() for managed_node3/stat 9733 1726773084.05080: done queuing things up, now waiting for results queue to drain 9733 1726773084.05082: waiting for pending results... 
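The task queued here (tasks/main.yml:50) stats a list of candidate profile directories and skips empty candidates. The exact loop expression is not visible in the log; what is visible is that the candidates involve __prof_from_conf, a task variable that appears to be built from the tuned config just read, plus the role's __kernel_settings_tuned_dir, that the first candidate is an empty string and is skipped by the item | length > 0 condition, and that the second, /etc/tuned/profiles, is stat'ed and does not exist on this host. A sketch under those assumptions:

    - name: Find tuned profile parent directory
      stat:
        path: "{{ item }}"
      loop: "{{ __prof_from_conf }}"   # assumed: candidate directories derived from the config read above
      when: item | length > 0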
10763 1726773084.05223: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 10763 1726773084.05346: in run() - task 0affffe7-6841-7dd6-8fa6-0000000006e1 10763 1726773084.05366: variable 'ansible_search_path' from source: unknown 10763 1726773084.05369: variable 'ansible_search_path' from source: unknown 10763 1726773084.05411: variable '__prof_from_conf' from source: task vars 10763 1726773084.05657: variable '__prof_from_conf' from source: task vars 10763 1726773084.05855: variable '__data' from source: task vars 10763 1726773084.05931: variable '__kernel_settings_register_tuned_main' from source: set_fact 10763 1726773084.06153: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10763 1726773084.06164: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10763 1726773084.06227: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10763 1726773084.06328: variable 'omit' from source: magic vars 10763 1726773084.06426: variable 'ansible_host' from source: host vars for 'managed_node3' 10763 1726773084.06438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10763 1726773084.06449: variable 'omit' from source: magic vars 10763 1726773084.06674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10763 1726773084.08425: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10763 1726773084.08472: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10763 1726773084.08504: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10763 1726773084.08543: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10763 1726773084.08564: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10763 1726773084.08621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10763 1726773084.08642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10763 1726773084.08661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10763 1726773084.08691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10763 1726773084.08704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10763 1726773084.08768: variable 'item' from source: unknown 10763 1726773084.08781: Evaluated conditional (item | length > 0): False 10763 1726773084.08784: when evaluation is False, skipping this task 10763 1726773084.08811: variable 'item' from source: unknown 10763 1726773084.08858: variable 'item' from source: unknown skipping: 
[managed_node3] => (item=) => { "ansible_loop_var": "item", "changed": false, "false_condition": "item | length > 0", "item": "", "skip_reason": "Conditional result was False" } 10763 1726773084.08914: variable 'ansible_host' from source: host vars for 'managed_node3' 10763 1726773084.08923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10763 1726773084.08930: variable 'omit' from source: magic vars 10763 1726773084.09023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10763 1726773084.09040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10763 1726773084.09055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10763 1726773084.09078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10763 1726773084.09089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10763 1726773084.09161: variable 'item' from source: unknown 10763 1726773084.09170: Evaluated conditional (item | length > 0): True 10763 1726773084.09177: variable 'omit' from source: magic vars 10763 1726773084.09224: variable 'omit' from source: magic vars 10763 1726773084.09270: variable 'item' from source: unknown 10763 1726773084.09316: variable 'item' from source: unknown 10763 1726773084.09330: variable 'omit' from source: magic vars 10763 1726773084.09349: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10763 1726773084.09369: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10763 1726773084.09384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10763 1726773084.09404: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10763 1726773084.09414: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10763 1726773084.09437: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10763 1726773084.09441: variable 'ansible_host' from source: host vars for 'managed_node3' 10763 1726773084.09445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10763 1726773084.09513: Set connection var ansible_timeout to 10 10763 1726773084.09518: Set connection var ansible_shell_type to sh 10763 1726773084.09523: Set connection var ansible_module_compression to ZIP_DEFLATED 10763 1726773084.09529: Set connection var ansible_shell_executable to /bin/sh 10763 1726773084.09534: Set connection var ansible_pipelining to False 10763 1726773084.09541: Set connection var ansible_connection to ssh 10763 1726773084.09554: variable 'ansible_shell_executable' from source: unknown 10763 1726773084.09557: variable 'ansible_connection' 
from source: unknown 10763 1726773084.09560: variable 'ansible_module_compression' from source: unknown 10763 1726773084.09564: variable 'ansible_shell_type' from source: unknown 10763 1726773084.09567: variable 'ansible_shell_executable' from source: unknown 10763 1726773084.09570: variable 'ansible_host' from source: host vars for 'managed_node3' 10763 1726773084.09575: variable 'ansible_pipelining' from source: unknown 10763 1726773084.09578: variable 'ansible_timeout' from source: unknown 10763 1726773084.09582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10763 1726773084.09669: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10763 1726773084.09679: variable 'omit' from source: magic vars 10763 1726773084.09688: starting attempt loop 10763 1726773084.09692: running the handler 10763 1726773084.09704: _low_level_execute_command(): starting 10763 1726773084.09712: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10763 1726773084.12051: stdout chunk (state=2): >>>/root <<< 10763 1726773084.12166: stderr chunk (state=3): >>><<< 10763 1726773084.12174: stdout chunk (state=3): >>><<< 10763 1726773084.12195: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10763 1726773084.12206: _low_level_execute_command(): starting 10763 1726773084.12212: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773084.1220229-10763-67809770313926 `" && echo ansible-tmp-1726773084.1220229-10763-67809770313926="` echo /root/.ansible/tmp/ansible-tmp-1726773084.1220229-10763-67809770313926 `" ) && sleep 0' 10763 1726773084.14696: stdout chunk (state=2): >>>ansible-tmp-1726773084.1220229-10763-67809770313926=/root/.ansible/tmp/ansible-tmp-1726773084.1220229-10763-67809770313926 <<< 10763 1726773084.14829: stderr chunk (state=3): >>><<< 10763 1726773084.14837: stdout chunk (state=3): >>><<< 10763 1726773084.14853: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773084.1220229-10763-67809770313926=/root/.ansible/tmp/ansible-tmp-1726773084.1220229-10763-67809770313926 , stderr= 10763 1726773084.14892: variable 'ansible_module_compression' from source: unknown 10763 1726773084.14933: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10763 1726773084.14964: variable 'ansible_facts' from source: unknown 10763 1726773084.15032: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773084.1220229-10763-67809770313926/AnsiballZ_stat.py 10763 1726773084.15132: Sending initial data 10763 1726773084.15139: Sent initial data (151 bytes) 10763 1726773084.17761: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpl73ypwlk /root/.ansible/tmp/ansible-tmp-1726773084.1220229-10763-67809770313926/AnsiballZ_stat.py <<< 10763 1726773084.18952: stderr chunk (state=3): >>><<< 10763 1726773084.18964: stdout chunk (state=3): >>><<< 10763 1726773084.18984: done transferring module to remote 10763 1726773084.18997: _low_level_execute_command(): starting 10763 1726773084.19002: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773084.1220229-10763-67809770313926/ 
/root/.ansible/tmp/ansible-tmp-1726773084.1220229-10763-67809770313926/AnsiballZ_stat.py && sleep 0' 10763 1726773084.21470: stderr chunk (state=2): >>><<< 10763 1726773084.21480: stdout chunk (state=2): >>><<< 10763 1726773084.21497: _low_level_execute_command() done: rc=0, stdout=, stderr= 10763 1726773084.21502: _low_level_execute_command(): starting 10763 1726773084.21508: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773084.1220229-10763-67809770313926/AnsiballZ_stat.py && sleep 0' 10763 1726773084.36580: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 10763 1726773084.37616: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10763 1726773084.37665: stderr chunk (state=3): >>><<< 10763 1726773084.37671: stdout chunk (state=3): >>><<< 10763 1726773084.37686: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.47.99 closed. 10763 1726773084.37706: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773084.1220229-10763-67809770313926/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10763 1726773084.37715: _low_level_execute_command(): starting 10763 1726773084.37718: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773084.1220229-10763-67809770313926/ > /dev/null 2>&1 && sleep 0' 10763 1726773084.40190: stderr chunk (state=2): >>><<< 10763 1726773084.40198: stdout chunk (state=2): >>><<< 10763 1726773084.40213: _low_level_execute_command() done: rc=0, stdout=, stderr= 10763 1726773084.40220: handler run complete 10763 1726773084.40235: attempt loop complete, returning result 10763 1726773084.40253: variable 'item' from source: unknown 10763 1726773084.40315: variable 'item' from source: unknown ok: [managed_node3] => (item=/etc/tuned/profiles) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned/profiles", "stat": { "exists": false } } 10763 1726773084.40404: variable 'ansible_host' from source: host vars for 'managed_node3' 10763 1726773084.40414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10763 1726773084.40423: variable 'omit' from source: magic vars 10763 1726773084.40530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10763 1726773084.40553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10763 1726773084.40572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10763 1726773084.40602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10763 1726773084.40614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10763 1726773084.40673: variable 'item' from source: unknown 10763 1726773084.40682: Evaluated conditional (item | length > 0): True 10763 1726773084.40692: variable 'omit' from source: magic vars 10763 1726773084.40705: variable 'omit' from source: magic vars 10763 1726773084.40735: variable 'item' from source: unknown 10763 1726773084.40783: variable 'item' from source: unknown 10763 1726773084.40799: variable 'omit' from source: magic vars 10763 1726773084.40816: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10763 1726773084.40825: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10763 1726773084.40831: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10763 1726773084.40843: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10763 1726773084.40847: variable 'ansible_host' from source: host vars for 'managed_node3' 10763 1726773084.40851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10763 1726773084.40905: Set connection var ansible_timeout to 10 10763 1726773084.40909: Set connection var ansible_shell_type to sh 10763 1726773084.40915: Set connection var ansible_module_compression to ZIP_DEFLATED 10763 1726773084.40920: Set connection var ansible_shell_executable to /bin/sh 10763 1726773084.40926: Set connection var ansible_pipelining to False 10763 1726773084.40932: Set connection var ansible_connection to ssh 10763 1726773084.40945: variable 'ansible_shell_executable' from source: unknown 10763 1726773084.40949: variable 'ansible_connection' from source: unknown 10763 1726773084.40952: variable 'ansible_module_compression' from source: unknown 10763 1726773084.40955: variable 'ansible_shell_type' from source: unknown 10763 1726773084.40958: variable 'ansible_shell_executable' from source: unknown 10763 1726773084.40962: variable 'ansible_host' from source: host vars for 'managed_node3' 10763 1726773084.40966: variable 'ansible_pipelining' from source: unknown 10763 1726773084.40969: variable 'ansible_timeout' from source: unknown 10763 1726773084.40973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10763 1726773084.41040: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10763 1726773084.41048: variable 
'omit' from source: magic vars 10763 1726773084.41052: starting attempt loop 10763 1726773084.41054: running the handler 10763 1726773084.41058: _low_level_execute_command(): starting 10763 1726773084.41061: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10763 1726773084.43276: stdout chunk (state=2): >>>/root <<< 10763 1726773084.43397: stderr chunk (state=3): >>><<< 10763 1726773084.43405: stdout chunk (state=3): >>><<< 10763 1726773084.43421: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10763 1726773084.43430: _low_level_execute_command(): starting 10763 1726773084.43436: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773084.4342732-10763-28123116656608 `" && echo ansible-tmp-1726773084.4342732-10763-28123116656608="` echo /root/.ansible/tmp/ansible-tmp-1726773084.4342732-10763-28123116656608 `" ) && sleep 0' 10763 1726773084.45931: stdout chunk (state=2): >>>ansible-tmp-1726773084.4342732-10763-28123116656608=/root/.ansible/tmp/ansible-tmp-1726773084.4342732-10763-28123116656608 <<< 10763 1726773084.46062: stderr chunk (state=3): >>><<< 10763 1726773084.46070: stdout chunk (state=3): >>><<< 10763 1726773084.46084: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773084.4342732-10763-28123116656608=/root/.ansible/tmp/ansible-tmp-1726773084.4342732-10763-28123116656608 , stderr= 10763 1726773084.46117: variable 'ansible_module_compression' from source: unknown 10763 1726773084.46152: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10763 1726773084.46172: variable 'ansible_facts' from source: unknown 10763 1726773084.46227: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773084.4342732-10763-28123116656608/AnsiballZ_stat.py 10763 1726773084.46320: Sending initial data 10763 1726773084.46327: Sent initial data (151 bytes) 10763 1726773084.48881: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmp42x8mpd_ /root/.ansible/tmp/ansible-tmp-1726773084.4342732-10763-28123116656608/AnsiballZ_stat.py <<< 10763 1726773084.50065: stderr chunk (state=3): >>><<< 10763 1726773084.50073: stdout chunk (state=3): >>><<< 10763 1726773084.50094: done transferring module to remote 10763 1726773084.50104: _low_level_execute_command(): starting 10763 1726773084.50110: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773084.4342732-10763-28123116656608/ /root/.ansible/tmp/ansible-tmp-1726773084.4342732-10763-28123116656608/AnsiballZ_stat.py && sleep 0' 10763 1726773084.52523: stderr chunk (state=2): >>><<< 10763 1726773084.52534: stdout chunk (state=2): >>><<< 10763 1726773084.52548: _low_level_execute_command() done: rc=0, stdout=, stderr= 10763 1726773084.52553: _low_level_execute_command(): starting 10763 1726773084.52558: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773084.4342732-10763-28123116656608/AnsiballZ_stat.py && sleep 0' 10763 1726773084.68267: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773065.3941085, "mtime": 
1726773063.682102, "ctime": 1726773063.682102, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 10763 1726773084.69411: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10763 1726773084.69459: stderr chunk (state=3): >>><<< 10763 1726773084.69466: stdout chunk (state=3): >>><<< 10763 1726773084.69484: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773065.3941085, "mtime": 1726773063.682102, "ctime": 1726773063.682102, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.47.99 closed. 
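The two stat results above (/etc/tuned/profiles absent, /etc/tuned present) and the skipped empty item at the top of this excerpt come from a looped stat task guarded by the `item | length > 0` conditional seen in the log. The role source is not reproduced in this log, so the following is only a minimal sketch consistent with the logged module arguments: the play wrapper and the candidate list are assumptions, and while the variable name `__kernel_settings_find_profile_dirs` appears in the log, there it is populated via set_fact rather than register.

```yaml
# Minimal sketch (assumed wrapper and candidate list): stat each candidate
# tuned directory, skipping empty items as the logged conditional does.
- hosts: managed_node3
  gather_facts: false
  tasks:
    - name: Find tuned profile parent directory
      ansible.builtin.stat:
        path: "{{ item }}"
      loop:
        - ""                    # produces the skipped item seen earlier in this excerpt
        - /etc/tuned/profiles   # logged as exists: false
        - /etc/tuned            # logged as exists: true
      when: item | length > 0
      register: __kernel_settings_find_profile_dirs
```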
10763 1726773084.69520: done with _execute_module (stat, {'path': '/etc/tuned', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773084.4342732-10763-28123116656608/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10763 1726773084.69528: _low_level_execute_command(): starting 10763 1726773084.69534: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773084.4342732-10763-28123116656608/ > /dev/null 2>&1 && sleep 0' 10763 1726773084.72011: stderr chunk (state=2): >>><<< 10763 1726773084.72024: stdout chunk (state=2): >>><<< 10763 1726773084.72039: _low_level_execute_command() done: rc=0, stdout=, stderr= 10763 1726773084.72046: handler run complete 10763 1726773084.72076: attempt loop complete, returning result 10763 1726773084.72093: variable 'item' from source: unknown 10763 1726773084.72157: variable 'item' from source: unknown ok: [managed_node3] => (item=/etc/tuned) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned", "stat": { "atime": 1726773065.3941085, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1726773063.682102, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 917919, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/directory", "mode": "0755", "mtime": 1726773063.682102, "nlink": 4, "path": "/etc/tuned", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 159, "uid": 0, "version": "1785990601", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 10763 1726773084.72207: dumping result to json 10763 1726773084.72217: done dumping result, returning 10763 1726773084.72225: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory [0affffe7-6841-7dd6-8fa6-0000000006e1] 10763 1726773084.72230: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006e1 10763 1726773084.72269: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006e1 10763 1726773084.72273: WORKER PROCESS EXITING 9733 1726773084.72511: no more pending results, returning what we have 9733 1726773084.72513: results queue empty 9733 1726773084.72514: checking for any_errors_fatal 9733 1726773084.72518: done checking for any_errors_fatal 9733 1726773084.72519: checking for max_fail_percentage 9733 1726773084.72520: done checking for max_fail_percentage 9733 1726773084.72520: checking to see if all hosts have failed and the running result is not ok 9733 1726773084.72521: done checking to see if all hosts have failed 9733 1726773084.72521: getting the remaining hosts for this loop 9733 1726773084.72522: done getting the remaining hosts for this loop 9733 1726773084.72525: getting the next task for host managed_node3 9733 1726773084.72531: done getting next task for host managed_node3 9733 1726773084.72534: 
^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 9733 1726773084.72537: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773084.72546: getting variables 9733 1726773084.72547: in VariableManager get_vars() 9733 1726773084.72572: Calling all_inventory to load vars for managed_node3 9733 1726773084.72574: Calling groups_inventory to load vars for managed_node3 9733 1726773084.72576: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773084.72586: Calling all_plugins_play to load vars for managed_node3 9733 1726773084.72589: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773084.72592: Calling groups_plugins_play to load vars for managed_node3 9733 1726773084.72684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773084.72802: done with get_vars() 9733 1726773084.72812: done getting variables 9733 1726773084.72852: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:63 Thursday 19 September 2024 15:11:24 -0400 (0:00:00.679) 0:00:30.461 **** 9733 1726773084.72874: entering _queue_task() for managed_node3/set_fact 9733 1726773084.73044: worker is 1 (out of 1 available) 9733 1726773084.73059: exiting _queue_task() for managed_node3/set_fact 9733 1726773084.73073: done queuing things up, now waiting for results queue to drain 9733 1726773084.73075: waiting for pending results... 
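The task queued here ("Set tuned profile parent dir", tasks/main.yml:63) reads the stat results above and, as the result further below shows, sets `__kernel_settings_profile_parent` to `/etc/tuned`. The role's actual expression is not visible in the log; the following is a self-contained sketch assuming a "first existing candidate" selection rule, with the vars block mirroring the stat results logged above.

```yaml
# Minimal sketch (assumed selection logic): pick the first candidate whose
# stat result reports that it exists. Run against localhost for illustration.
- hosts: localhost
  gather_facts: false
  vars:
    # Mirrors the loop results logged above for managed_node3.
    __kernel_settings_find_profile_dirs:
      results:
        - item: /etc/tuned/profiles
          stat: {exists: false}
        - item: /etc/tuned
          stat: {exists: true}
  tasks:
    - name: Set tuned profile parent dir
      ansible.builtin.set_fact:
        __kernel_settings_profile_parent: >-
          {{ __kernel_settings_find_profile_dirs.results
             | selectattr('stat', 'defined')
             | selectattr('stat.exists')
             | map(attribute='item')
             | first }}

    - name: Show the selected parent dir
      ansible.builtin.debug:
        var: __kernel_settings_profile_parent   # prints /etc/tuned, matching the logged result
```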
10781 1726773084.73214: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 10781 1726773084.73334: in run() - task 0affffe7-6841-7dd6-8fa6-0000000006e2 10781 1726773084.73351: variable 'ansible_search_path' from source: unknown 10781 1726773084.73355: variable 'ansible_search_path' from source: unknown 10781 1726773084.73383: calling self._execute() 10781 1726773084.73457: variable 'ansible_host' from source: host vars for 'managed_node3' 10781 1726773084.73466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10781 1726773084.73475: variable 'omit' from source: magic vars 10781 1726773084.73554: variable 'omit' from source: magic vars 10781 1726773084.73596: variable 'omit' from source: magic vars 10781 1726773084.73914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10781 1726773084.75677: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10781 1726773084.75726: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10781 1726773084.75754: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10781 1726773084.75781: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10781 1726773084.75803: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10781 1726773084.75855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10781 1726773084.75872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10781 1726773084.75889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10781 1726773084.75918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10781 1726773084.75927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10781 1726773084.75956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10781 1726773084.75970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10781 1726773084.75986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10781 1726773084.76011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10781 1726773084.76021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10781 1726773084.76056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10781 1726773084.76070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10781 1726773084.76084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10781 1726773084.76114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10781 1726773084.76125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10781 1726773084.76287: variable '__kernel_settings_find_profile_dirs' from source: set_fact 10781 1726773084.76353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10781 1726773084.76477: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10781 1726773084.76507: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10781 1726773084.76529: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10781 1726773084.76551: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10781 1726773084.76579: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10781 1726773084.76602: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10781 1726773084.76622: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10781 1726773084.76640: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10781 1726773084.76679: variable 'omit' from source: magic vars 10781 1726773084.76702: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10781 1726773084.76722: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10781 1726773084.76737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10781 1726773084.76750: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10781 1726773084.76760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10781 1726773084.76784: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10781 1726773084.76791: variable 'ansible_host' from source: host vars for 'managed_node3' 10781 1726773084.76796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10781 1726773084.76860: Set connection var ansible_timeout to 10 10781 1726773084.76866: Set connection var ansible_shell_type to sh 10781 1726773084.76872: Set connection var ansible_module_compression to ZIP_DEFLATED 10781 1726773084.76877: Set connection var ansible_shell_executable to /bin/sh 10781 1726773084.76882: Set connection var ansible_pipelining to False 10781 1726773084.76891: Set connection var ansible_connection to ssh 10781 1726773084.76911: variable 'ansible_shell_executable' from source: unknown 10781 1726773084.76915: variable 'ansible_connection' from source: unknown 10781 1726773084.76919: variable 'ansible_module_compression' from source: unknown 10781 1726773084.76922: variable 'ansible_shell_type' from source: unknown 10781 1726773084.76925: variable 'ansible_shell_executable' from source: unknown 10781 1726773084.76928: variable 'ansible_host' from source: host vars for 'managed_node3' 10781 1726773084.76932: variable 'ansible_pipelining' from source: unknown 10781 1726773084.76935: variable 'ansible_timeout' from source: unknown 10781 1726773084.76940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10781 1726773084.77003: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10781 1726773084.77014: variable 'omit' from source: magic vars 10781 1726773084.77019: starting attempt loop 10781 1726773084.77021: running the handler 10781 1726773084.77028: handler run complete 10781 1726773084.77034: attempt loop complete, returning result 10781 1726773084.77036: _execute() done 10781 1726773084.77037: dumping result to json 10781 1726773084.77039: done dumping result, returning 10781 1726773084.77043: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir [0affffe7-6841-7dd6-8fa6-0000000006e2] 10781 1726773084.77047: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006e2 10781 1726773084.77062: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006e2 10781 1726773084.77064: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_profile_parent": "/etc/tuned" }, "changed": false } 9733 1726773084.77374: no more pending results, returning what we have 9733 1726773084.77376: results queue empty 9733 1726773084.77377: checking for any_errors_fatal 9733 1726773084.77392: done checking for any_errors_fatal 9733 1726773084.77392: checking for max_fail_percentage 9733 1726773084.77393: done checking for max_fail_percentage 9733 1726773084.77394: checking to see if all hosts have failed and the running result is not ok 9733 1726773084.77394: done checking to see if all hosts have failed 9733 1726773084.77395: getting the 
remaining hosts for this loop 9733 1726773084.77396: done getting the remaining hosts for this loop 9733 1726773084.77398: getting the next task for host managed_node3 9733 1726773084.77403: done getting next task for host managed_node3 9733 1726773084.77406: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 9733 1726773084.77408: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773084.77415: getting variables 9733 1726773084.77416: in VariableManager get_vars() 9733 1726773084.77445: Calling all_inventory to load vars for managed_node3 9733 1726773084.77447: Calling groups_inventory to load vars for managed_node3 9733 1726773084.77448: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773084.77456: Calling all_plugins_play to load vars for managed_node3 9733 1726773084.77458: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773084.77459: Calling groups_plugins_play to load vars for managed_node3 9733 1726773084.77566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773084.77690: done with get_vars() 9733 1726773084.77699: done getting variables 9733 1726773084.77741: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67 Thursday 19 September 2024 15:11:24 -0400 (0:00:00.048) 0:00:30.510 **** 9733 1726773084.77765: entering _queue_task() for managed_node3/service 9733 1726773084.77936: worker is 1 (out of 1 available) 9733 1726773084.77951: exiting _queue_task() for managed_node3/service 9733 1726773084.77964: done queuing things up, now waiting for results queue to drain 9733 1726773084.77966: waiting for pending results... 
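The task queued here ("Ensure required services are enabled and started", tasks/main.yml:67) loops over `__kernel_settings_services`, loaded via include_vars, and ends up invoking the systemd module with name=tuned, state=started, enabled=true, as the module_args in the result below show. A minimal sketch, assuming the service list contains only tuned:

```yaml
# Minimal sketch of a task matching the logged systemd invocation
# (name=tuned, state=started, enabled=true); the vars block is an
# assumption standing in for the role's include_vars data.
- hosts: managed_node3
  gather_facts: false
  vars:
    __kernel_settings_services:
      - tuned
  tasks:
    - name: Ensure required services are enabled and started
      ansible.builtin.service:
        name: "{{ item }}"
        state: started
        enabled: true
      loop: "{{ __kernel_settings_services }}"
```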
10782 1726773084.78102: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 10782 1726773084.78230: in run() - task 0affffe7-6841-7dd6-8fa6-0000000006e3 10782 1726773084.78246: variable 'ansible_search_path' from source: unknown 10782 1726773084.78250: variable 'ansible_search_path' from source: unknown 10782 1726773084.78284: variable '__kernel_settings_services' from source: include_vars 10782 1726773084.78584: variable '__kernel_settings_services' from source: include_vars 10782 1726773084.78644: variable 'omit' from source: magic vars 10782 1726773084.78716: variable 'ansible_host' from source: host vars for 'managed_node3' 10782 1726773084.78727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10782 1726773084.78736: variable 'omit' from source: magic vars 10782 1726773084.78790: variable 'omit' from source: magic vars 10782 1726773084.78828: variable 'omit' from source: magic vars 10782 1726773084.78863: variable 'item' from source: unknown 10782 1726773084.78919: variable 'item' from source: unknown 10782 1726773084.78939: variable 'omit' from source: magic vars 10782 1726773084.78971: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10782 1726773084.78997: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10782 1726773084.79017: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10782 1726773084.79031: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10782 1726773084.79041: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10782 1726773084.79064: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10782 1726773084.79069: variable 'ansible_host' from source: host vars for 'managed_node3' 10782 1726773084.79073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10782 1726773084.79142: Set connection var ansible_timeout to 10 10782 1726773084.79147: Set connection var ansible_shell_type to sh 10782 1726773084.79153: Set connection var ansible_module_compression to ZIP_DEFLATED 10782 1726773084.79159: Set connection var ansible_shell_executable to /bin/sh 10782 1726773084.79164: Set connection var ansible_pipelining to False 10782 1726773084.79172: Set connection var ansible_connection to ssh 10782 1726773084.79189: variable 'ansible_shell_executable' from source: unknown 10782 1726773084.79193: variable 'ansible_connection' from source: unknown 10782 1726773084.79196: variable 'ansible_module_compression' from source: unknown 10782 1726773084.79202: variable 'ansible_shell_type' from source: unknown 10782 1726773084.79205: variable 'ansible_shell_executable' from source: unknown 10782 1726773084.79208: variable 'ansible_host' from source: host vars for 'managed_node3' 10782 1726773084.79212: variable 'ansible_pipelining' from source: unknown 10782 1726773084.79215: variable 'ansible_timeout' from source: unknown 10782 1726773084.79220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10782 1726773084.79312: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10782 1726773084.79324: variable 'omit' from source: magic vars 10782 1726773084.79330: starting attempt loop 10782 1726773084.79334: running the handler 10782 1726773084.79397: variable 'ansible_facts' from source: unknown 10782 1726773084.79472: _low_level_execute_command(): starting 10782 1726773084.79481: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10782 1726773084.81852: stdout chunk (state=2): >>>/root <<< 10782 1726773084.81973: stderr chunk (state=3): >>><<< 10782 1726773084.81981: stdout chunk (state=3): >>><<< 10782 1726773084.82004: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10782 1726773084.82017: _low_level_execute_command(): starting 10782 1726773084.82024: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773084.8201199-10782-274875619580207 `" && echo ansible-tmp-1726773084.8201199-10782-274875619580207="` echo /root/.ansible/tmp/ansible-tmp-1726773084.8201199-10782-274875619580207 `" ) && sleep 0' 10782 1726773084.84513: stdout chunk (state=2): >>>ansible-tmp-1726773084.8201199-10782-274875619580207=/root/.ansible/tmp/ansible-tmp-1726773084.8201199-10782-274875619580207 <<< 10782 1726773084.84643: stderr chunk (state=3): >>><<< 10782 1726773084.84651: stdout chunk (state=3): >>><<< 10782 1726773084.84666: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773084.8201199-10782-274875619580207=/root/.ansible/tmp/ansible-tmp-1726773084.8201199-10782-274875619580207 , stderr= 10782 1726773084.84693: variable 'ansible_module_compression' from source: unknown 10782 1726773084.84734: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 10782 1726773084.84787: variable 'ansible_facts' from source: unknown 10782 1726773084.84944: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773084.8201199-10782-274875619580207/AnsiballZ_systemd.py 10782 1726773084.85053: Sending initial data 10782 1726773084.85060: Sent initial data (155 bytes) 10782 1726773084.87611: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpc8l8ujay /root/.ansible/tmp/ansible-tmp-1726773084.8201199-10782-274875619580207/AnsiballZ_systemd.py <<< 10782 1726773084.89639: stderr chunk (state=3): >>><<< 10782 1726773084.89648: stdout chunk (state=3): >>><<< 10782 1726773084.89668: done transferring module to remote 10782 1726773084.89679: _low_level_execute_command(): starting 10782 1726773084.89684: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773084.8201199-10782-274875619580207/ /root/.ansible/tmp/ansible-tmp-1726773084.8201199-10782-274875619580207/AnsiballZ_systemd.py && sleep 0' 10782 1726773084.92105: stderr chunk (state=2): >>><<< 10782 1726773084.92114: stdout chunk (state=2): >>><<< 10782 1726773084.92128: _low_level_execute_command() done: rc=0, stdout=, stderr= 10782 1726773084.92133: _low_level_execute_command(): starting 10782 1726773084.92138: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773084.8201199-10782-274875619580207/AnsiballZ_systemd.py && sleep 0' 10782 1726773085.19773: stdout chunk (state=2): >>> {"name": 
"tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:11:05 EDT", "WatchdogTimestampMonotonic": "480455087", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "15004", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ExecMainStartTimestampMonotonic": "480313127", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "15004", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:11:05 EDT] ; stop_time=[n/a] ; pid=15004 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "15097856", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "Memo<<< 10782 1726773085.19814: stdout chunk (state=3): >>>ryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": 
"inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:11:05 EDT", "StateChangeTimestampMonotonic": "480455091", "InactiveExitTimestamp": "Thu 2024-09-19 15:11:05 EDT", "InactiveExitTimestampMonotonic": "480313185", "ActiveEnterTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ActiveEnterTimestampMonotonic": "480455091", "ActiveExitTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ActiveExitTimestampMonotonic": "480218270", "InactiveEnterTimestamp": "Thu 2024-09-19 15:11:05 EDT", "InactiveEnterTimestampMonotonic": "480310218", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", 
"ConditionTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ConditionTimestampMonotonic": "480311469", "AssertTimestamp": "Thu 2024-09-19 15:11:05 EDT", "AssertTimestampMonotonic": "480311470", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5c390172c7314a188777ca74147bd412", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 10782 1726773085.21565: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10782 1726773085.21576: stdout chunk (state=3): >>><<< 10782 1726773085.21590: stderr chunk (state=3): >>><<< 10782 1726773085.21612: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:11:05 EDT", "WatchdogTimestampMonotonic": "480455087", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "15004", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ExecMainStartTimestampMonotonic": "480313127", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "15004", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:11:05 EDT] ; stop_time=[n/a] ; pid=15004 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "15097856", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": 
"262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:11:05 EDT", 
"StateChangeTimestampMonotonic": "480455091", "InactiveExitTimestamp": "Thu 2024-09-19 15:11:05 EDT", "InactiveExitTimestampMonotonic": "480313185", "ActiveEnterTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ActiveEnterTimestampMonotonic": "480455091", "ActiveExitTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ActiveExitTimestampMonotonic": "480218270", "InactiveEnterTimestamp": "Thu 2024-09-19 15:11:05 EDT", "InactiveEnterTimestampMonotonic": "480310218", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ConditionTimestampMonotonic": "480311469", "AssertTimestamp": "Thu 2024-09-19 15:11:05 EDT", "AssertTimestampMonotonic": "480311470", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5c390172c7314a188777ca74147bd412", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.47.99 closed. 10782 1726773085.22091: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773084.8201199-10782-274875619580207/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10782 1726773085.22110: _low_level_execute_command(): starting 10782 1726773085.22117: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773084.8201199-10782-274875619580207/ > /dev/null 2>&1 && sleep 0' 10782 1726773085.24653: stderr chunk (state=2): >>><<< 10782 1726773085.24665: stdout chunk (state=2): >>><<< 10782 1726773085.24681: _low_level_execute_command() done: rc=0, stdout=, stderr= 10782 1726773085.24691: handler run complete 10782 1726773085.24750: attempt loop complete, returning result 10782 1726773085.24770: variable 'item' from source: unknown 10782 1726773085.24848: variable 'item' from source: unknown ok: [managed_node3] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ActiveEnterTimestampMonotonic": "480455091", "ActiveExitTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ActiveExitTimestampMonotonic": "480218270", "ActiveState": "active", "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target 
system.slice network.target dbus.service sysinit.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 15:11:05 EDT", "AssertTimestampMonotonic": "480311470", "Before": "shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ConditionTimestampMonotonic": "480311469", "ConfigurationDirectoryMode": "0755", "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "15004", "ExecMainStartTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ExecMainStartTimestampMonotonic": "480313127", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:11:05 EDT] ; stop_time=[n/a] ; pid=15004 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestamp": "Thu 2024-09-19 15:11:05 EDT", "InactiveEnterTimestampMonotonic": "480310218", "InactiveExitTimestamp": "Thu 2024-09-19 15:11:05 EDT", "InactiveExitTimestampMonotonic": "480313185", "InvocationID": "5c390172c7314a188777ca74147bd412", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": 
"infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "15004", "MemoryAccounting": "yes", "MemoryCurrent": "15097856", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 15:11:05 EDT", "StateChangeTimestampMonotonic": "480455091", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 15:11:05 EDT", "WatchdogTimestampMonotonic": "480455087", 
"WatchdogUSec": "0" } } 10782 1726773085.24959: dumping result to json 10782 1726773085.24978: done dumping result, returning 10782 1726773085.24989: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started [0affffe7-6841-7dd6-8fa6-0000000006e3] 10782 1726773085.24996: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006e3 9733 1726773085.25708: no more pending results, returning what we have 9733 1726773085.25711: results queue empty 9733 1726773085.25712: checking for any_errors_fatal 9733 1726773085.25715: done checking for any_errors_fatal 9733 1726773085.25716: checking for max_fail_percentage 9733 1726773085.25717: done checking for max_fail_percentage 9733 1726773085.25717: checking to see if all hosts have failed and the running result is not ok 9733 1726773085.25718: done checking to see if all hosts have failed 9733 1726773085.25718: getting the remaining hosts for this loop 9733 1726773085.25719: done getting the remaining hosts for this loop 9733 1726773085.25722: getting the next task for host managed_node3 9733 1726773085.25727: done getting next task for host managed_node3 9733 1726773085.25730: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 9733 1726773085.25732: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 9733 1726773085.25742: getting variables 9733 1726773085.25743: in VariableManager get_vars() 9733 1726773085.25769: Calling all_inventory to load vars for managed_node3 9733 1726773085.25772: Calling groups_inventory to load vars for managed_node3 9733 1726773085.25774: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773085.25783: Calling all_plugins_play to load vars for managed_node3 9733 1726773085.25788: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773085.25791: Calling groups_plugins_play to load vars for managed_node3 9733 1726773085.25943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773085.26142: done with get_vars() 9733 1726773085.26152: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74 Thursday 19 September 2024 15:11:25 -0400 (0:00:00.484) 0:00:30.994 **** 9733 1726773085.26246: entering _queue_task() for managed_node3/file 9733 1726773085.26458: worker is 1 (out of 1 available) 9733 1726773085.26474: exiting _queue_task() for managed_node3/file 9733 1726773085.26489: done queuing things up, now waiting for results queue to drain 9733 1726773085.26491: waiting for pending results... 10782 1726773085.25110: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006e3 10782 1726773085.25115: WORKER PROCESS EXITING 10803 1726773085.26639: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 10803 1726773085.26766: in run() - task 0affffe7-6841-7dd6-8fa6-0000000006e4 10803 1726773085.26784: variable 'ansible_search_path' from source: unknown 10803 1726773085.26790: variable 'ansible_search_path' from source: unknown 10803 1726773085.26818: calling self._execute() 10803 1726773085.26893: variable 'ansible_host' from source: host vars for 'managed_node3' 10803 1726773085.26902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10803 1726773085.26910: variable 'omit' from source: magic vars 10803 1726773085.26988: variable 'omit' from source: magic vars 10803 1726773085.27026: variable 'omit' from source: magic vars 10803 1726773085.27049: variable '__kernel_settings_profile_dir' from source: role '' all vars 10803 1726773085.27273: variable '__kernel_settings_profile_dir' from source: role '' all vars 10803 1726773085.27349: variable '__kernel_settings_profile_parent' from source: set_fact 10803 1726773085.27359: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10803 1726773085.27422: variable 'omit' from source: magic vars 10803 1726773085.27454: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10803 1726773085.27481: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10803 1726773085.27502: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10803 1726773085.27518: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10803 1726773085.27529: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10803 1726773085.27552: 
variable 'inventory_hostname' from source: host vars for 'managed_node3' 10803 1726773085.27558: variable 'ansible_host' from source: host vars for 'managed_node3' 10803 1726773085.27562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10803 1726773085.27635: Set connection var ansible_timeout to 10 10803 1726773085.27640: Set connection var ansible_shell_type to sh 10803 1726773085.27646: Set connection var ansible_module_compression to ZIP_DEFLATED 10803 1726773085.27651: Set connection var ansible_shell_executable to /bin/sh 10803 1726773085.27657: Set connection var ansible_pipelining to False 10803 1726773085.27664: Set connection var ansible_connection to ssh 10803 1726773085.27680: variable 'ansible_shell_executable' from source: unknown 10803 1726773085.27684: variable 'ansible_connection' from source: unknown 10803 1726773085.27689: variable 'ansible_module_compression' from source: unknown 10803 1726773085.27693: variable 'ansible_shell_type' from source: unknown 10803 1726773085.27696: variable 'ansible_shell_executable' from source: unknown 10803 1726773085.27700: variable 'ansible_host' from source: host vars for 'managed_node3' 10803 1726773085.27704: variable 'ansible_pipelining' from source: unknown 10803 1726773085.27707: variable 'ansible_timeout' from source: unknown 10803 1726773085.27712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10803 1726773085.27852: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10803 1726773085.27864: variable 'omit' from source: magic vars 10803 1726773085.27870: starting attempt loop 10803 1726773085.27874: running the handler 10803 1726773085.27887: _low_level_execute_command(): starting 10803 1726773085.27895: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10803 1726773085.30352: stdout chunk (state=2): >>>/root <<< 10803 1726773085.30468: stderr chunk (state=3): >>><<< 10803 1726773085.30475: stdout chunk (state=3): >>><<< 10803 1726773085.30494: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10803 1726773085.30509: _low_level_execute_command(): starting 10803 1726773085.30516: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773085.305041-10803-253065315649020 `" && echo ansible-tmp-1726773085.305041-10803-253065315649020="` echo /root/.ansible/tmp/ansible-tmp-1726773085.305041-10803-253065315649020 `" ) && sleep 0' 10803 1726773085.33015: stdout chunk (state=2): >>>ansible-tmp-1726773085.305041-10803-253065315649020=/root/.ansible/tmp/ansible-tmp-1726773085.305041-10803-253065315649020 <<< 10803 1726773085.33137: stderr chunk (state=3): >>><<< 10803 1726773085.33143: stdout chunk (state=3): >>><<< 10803 1726773085.33156: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773085.305041-10803-253065315649020=/root/.ansible/tmp/ansible-tmp-1726773085.305041-10803-253065315649020 , stderr= 10803 1726773085.33194: variable 'ansible_module_compression' from source: unknown 10803 1726773085.33238: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 10803 1726773085.33268: variable 'ansible_facts' from source: unknown 10803 
1726773085.33341: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773085.305041-10803-253065315649020/AnsiballZ_file.py 10803 1726773085.33438: Sending initial data 10803 1726773085.33445: Sent initial data (151 bytes) 10803 1726773085.36004: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmphg5u300f /root/.ansible/tmp/ansible-tmp-1726773085.305041-10803-253065315649020/AnsiballZ_file.py <<< 10803 1726773085.37210: stderr chunk (state=3): >>><<< 10803 1726773085.37218: stdout chunk (state=3): >>><<< 10803 1726773085.37237: done transferring module to remote 10803 1726773085.37248: _low_level_execute_command(): starting 10803 1726773085.37254: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773085.305041-10803-253065315649020/ /root/.ansible/tmp/ansible-tmp-1726773085.305041-10803-253065315649020/AnsiballZ_file.py && sleep 0' 10803 1726773085.39775: stderr chunk (state=2): >>><<< 10803 1726773085.39788: stdout chunk (state=2): >>><<< 10803 1726773085.39806: _low_level_execute_command() done: rc=0, stdout=, stderr= 10803 1726773085.39811: _low_level_execute_command(): starting 10803 1726773085.39816: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773085.305041-10803-253065315649020/AnsiballZ_file.py && sleep 0' 10803 1726773085.56169: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10803 1726773085.57328: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10803 1726773085.57373: stderr chunk (state=3): >>><<< 10803 1726773085.57380: stdout chunk (state=3): >>><<< 10803 1726773085.57401: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.47.99 closed. 
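The module arguments echoed in the stdout above correspond to a plain directory task. A minimal sketch of the kind of task that produces this invocation, with the literal path standing in for the role's __kernel_settings_profile_dir variable (the actual task sits at roles/kernel_settings/tasks/main.yml:74 and may differ in wording):

    - name: Ensure kernel settings profile directory exists
      ansible.builtin.file:
        path: /etc/tuned/kernel_settings   # literal stand-in for __kernel_settings_profile_dir
        state: directory
        mode: "0755"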
10803 1726773085.57435: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773085.305041-10803-253065315649020/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10803 1726773085.57446: _low_level_execute_command(): starting 10803 1726773085.57452: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773085.305041-10803-253065315649020/ > /dev/null 2>&1 && sleep 0' 10803 1726773085.59929: stderr chunk (state=2): >>><<< 10803 1726773085.59941: stdout chunk (state=2): >>><<< 10803 1726773085.59957: _low_level_execute_command() done: rc=0, stdout=, stderr= 10803 1726773085.59964: handler run complete 10803 1726773085.59983: attempt loop complete, returning result 10803 1726773085.59989: _execute() done 10803 1726773085.59992: dumping result to json 10803 1726773085.59998: done dumping result, returning 10803 1726773085.60009: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists [0affffe7-6841-7dd6-8fa6-0000000006e4] 10803 1726773085.60015: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006e4 10803 1726773085.60049: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006e4 10803 1726773085.60053: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/tuned/kernel_settings", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "state": "directory", "uid": 0 } 9733 1726773085.60326: no more pending results, returning what we have 9733 1726773085.60328: results queue empty 9733 1726773085.60329: checking for any_errors_fatal 9733 1726773085.60341: done checking for any_errors_fatal 9733 1726773085.60341: checking for max_fail_percentage 9733 1726773085.60342: done checking for max_fail_percentage 9733 1726773085.60343: checking to see if all hosts have failed and the running result is not ok 9733 1726773085.60343: done checking to see if all hosts have failed 9733 1726773085.60343: getting the remaining hosts for this loop 9733 1726773085.60344: done getting the remaining hosts for this loop 9733 1726773085.60347: getting the next task for host managed_node3 9733 1726773085.60351: done getting next task for host managed_node3 9733 1726773085.60354: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 9733 1726773085.60357: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773085.60363: getting variables 9733 1726773085.60364: in VariableManager get_vars() 9733 1726773085.60397: Calling all_inventory to load vars for managed_node3 9733 1726773085.60399: Calling groups_inventory to load vars for managed_node3 9733 1726773085.60403: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773085.60410: Calling all_plugins_play to load vars for managed_node3 9733 1726773085.60412: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773085.60414: Calling groups_plugins_play to load vars for managed_node3 9733 1726773085.60526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773085.60662: done with get_vars() 9733 1726773085.60678: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get active_profile] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80 Thursday 19 September 2024 15:11:25 -0400 (0:00:00.345) 0:00:31.340 **** 9733 1726773085.60768: entering _queue_task() for managed_node3/slurp 9733 1726773085.60977: worker is 1 (out of 1 available) 9733 1726773085.60992: exiting _queue_task() for managed_node3/slurp 9733 1726773085.61009: done queuing things up, now waiting for results queue to drain 9733 1726773085.61013: waiting for pending results... 
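The Get active_profile task just queued (task path .../kernel_settings/tasks/main.yml:80) uses the slurp module, which reads a remote file and returns its contents base64-encoded. A minimal sketch, assuming the result is registered under the variable name that appears later in this log:

    - name: Get active_profile
      ansible.builtin.slurp:
        path: /etc/tuned/active_profile   # resolved from __kernel_settings_tuned_active_profile in the role
      register: __kernel_settings_tuned_current_profile   # assumed register target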
10818 1726773085.61253: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 10818 1726773085.61421: in run() - task 0affffe7-6841-7dd6-8fa6-0000000006e5 10818 1726773085.61446: variable 'ansible_search_path' from source: unknown 10818 1726773085.61451: variable 'ansible_search_path' from source: unknown 10818 1726773085.61487: calling self._execute() 10818 1726773085.61581: variable 'ansible_host' from source: host vars for 'managed_node3' 10818 1726773085.61595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10818 1726773085.61607: variable 'omit' from source: magic vars 10818 1726773085.61713: variable 'omit' from source: magic vars 10818 1726773085.61767: variable 'omit' from source: magic vars 10818 1726773085.61803: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 10818 1726773085.62101: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 10818 1726773085.62179: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10818 1726773085.62216: variable 'omit' from source: magic vars 10818 1726773085.62251: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10818 1726773085.62372: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10818 1726773085.62397: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10818 1726773085.62416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10818 1726773085.62430: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10818 1726773085.62461: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10818 1726773085.62467: variable 'ansible_host' from source: host vars for 'managed_node3' 10818 1726773085.62471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10818 1726773085.62570: Set connection var ansible_timeout to 10 10818 1726773085.62576: Set connection var ansible_shell_type to sh 10818 1726773085.62582: Set connection var ansible_module_compression to ZIP_DEFLATED 10818 1726773085.62588: Set connection var ansible_shell_executable to /bin/sh 10818 1726773085.62593: Set connection var ansible_pipelining to False 10818 1726773085.62600: Set connection var ansible_connection to ssh 10818 1726773085.62618: variable 'ansible_shell_executable' from source: unknown 10818 1726773085.62623: variable 'ansible_connection' from source: unknown 10818 1726773085.62626: variable 'ansible_module_compression' from source: unknown 10818 1726773085.62629: variable 'ansible_shell_type' from source: unknown 10818 1726773085.62631: variable 'ansible_shell_executable' from source: unknown 10818 1726773085.62634: variable 'ansible_host' from source: host vars for 'managed_node3' 10818 1726773085.62637: variable 'ansible_pipelining' from source: unknown 10818 1726773085.62640: variable 'ansible_timeout' from source: unknown 10818 1726773085.62643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10818 1726773085.62820: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10818 1726773085.62834: variable 'omit' from source: magic vars 10818 1726773085.62841: starting attempt loop 10818 1726773085.62845: running the handler 10818 1726773085.62858: _low_level_execute_command(): starting 10818 1726773085.62866: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10818 1726773085.65250: stdout chunk (state=2): >>>/root <<< 10818 1726773085.65376: stderr chunk (state=3): >>><<< 10818 1726773085.65383: stdout chunk (state=3): >>><<< 10818 1726773085.65401: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10818 1726773085.65415: _low_level_execute_command(): starting 10818 1726773085.65420: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773085.6540737-10818-201527187561379 `" && echo ansible-tmp-1726773085.6540737-10818-201527187561379="` echo /root/.ansible/tmp/ansible-tmp-1726773085.6540737-10818-201527187561379 `" ) && sleep 0' 10818 1726773085.67897: stdout chunk (state=2): >>>ansible-tmp-1726773085.6540737-10818-201527187561379=/root/.ansible/tmp/ansible-tmp-1726773085.6540737-10818-201527187561379 <<< 10818 1726773085.68030: stderr chunk (state=3): >>><<< 10818 1726773085.68037: stdout chunk (state=3): >>><<< 10818 1726773085.68052: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773085.6540737-10818-201527187561379=/root/.ansible/tmp/ansible-tmp-1726773085.6540737-10818-201527187561379 , stderr= 10818 1726773085.68091: variable 'ansible_module_compression' from source: unknown 10818 1726773085.68125: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.slurp-ZIP_DEFLATED 10818 1726773085.68152: variable 'ansible_facts' from source: unknown 10818 1726773085.68225: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773085.6540737-10818-201527187561379/AnsiballZ_slurp.py 10818 1726773085.68329: Sending initial data 10818 1726773085.68337: Sent initial data (153 bytes) 10818 1726773085.70893: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmptqzwor3i /root/.ansible/tmp/ansible-tmp-1726773085.6540737-10818-201527187561379/AnsiballZ_slurp.py <<< 10818 1726773085.72058: stderr chunk (state=3): >>><<< 10818 1726773085.72067: stdout chunk (state=3): >>><<< 10818 1726773085.72083: done transferring module to remote 10818 1726773085.72095: _low_level_execute_command(): starting 10818 1726773085.72099: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773085.6540737-10818-201527187561379/ /root/.ansible/tmp/ansible-tmp-1726773085.6540737-10818-201527187561379/AnsiballZ_slurp.py && sleep 0' 10818 1726773085.74496: stderr chunk (state=2): >>><<< 10818 1726773085.74505: stdout chunk (state=2): >>><<< 10818 1726773085.74519: _low_level_execute_command() done: rc=0, stdout=, stderr= 10818 1726773085.74523: _low_level_execute_command(): starting 10818 1726773085.74528: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773085.6540737-10818-201527187561379/AnsiballZ_slurp.py && sleep 0' 10818 1726773085.89268: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", 
"invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 10818 1726773085.90479: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10818 1726773085.90496: stdout chunk (state=3): >>><<< 10818 1726773085.90511: stderr chunk (state=3): >>><<< 10818 1726773085.90521: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.47.99 closed. 10818 1726773085.90542: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773085.6540737-10818-201527187561379/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10818 1726773085.90550: _low_level_execute_command(): starting 10818 1726773085.90555: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773085.6540737-10818-201527187561379/ > /dev/null 2>&1 && sleep 0' 10818 1726773085.93251: stderr chunk (state=2): >>><<< 10818 1726773085.93262: stdout chunk (state=2): >>><<< 10818 1726773085.93278: _low_level_execute_command() done: rc=0, stdout=, stderr= 10818 1726773085.93288: handler run complete 10818 1726773085.93307: attempt loop complete, returning result 10818 1726773085.93312: _execute() done 10818 1726773085.93316: dumping result to json 10818 1726773085.93320: done dumping result, returning 10818 1726773085.93327: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile [0affffe7-6841-7dd6-8fa6-0000000006e5] 10818 1726773085.93335: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006e5 10818 1726773085.93371: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006e5 10818 1726773085.93373: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "encoding": "base64", "source": "/etc/tuned/active_profile" } 9733 1726773085.93634: no more pending results, returning what we have 9733 1726773085.93637: results queue empty 9733 1726773085.93638: checking for any_errors_fatal 9733 1726773085.93646: done checking for any_errors_fatal 9733 1726773085.93647: checking for max_fail_percentage 9733 1726773085.93648: done checking for max_fail_percentage 9733 1726773085.93648: checking to see if all hosts have failed and the running result is not ok 9733 1726773085.93649: done checking to see if all hosts have failed 9733 1726773085.93650: getting the remaining hosts for this loop 9733 1726773085.93651: done getting the remaining hosts for this loop 9733 1726773085.93654: getting the next task for host managed_node3 9733 1726773085.93660: done getting next task for host managed_node3 9733 1726773085.93663: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 9733 1726773085.93666: ^ state is: HOST STATE: 
block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773085.93677: getting variables 9733 1726773085.93678: in VariableManager get_vars() 9733 1726773085.93709: Calling all_inventory to load vars for managed_node3 9733 1726773085.93711: Calling groups_inventory to load vars for managed_node3 9733 1726773085.93712: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773085.93720: Calling all_plugins_play to load vars for managed_node3 9733 1726773085.93722: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773085.93724: Calling groups_plugins_play to load vars for managed_node3 9733 1726773085.93834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773085.94009: done with get_vars() 9733 1726773085.94020: done getting variables 9733 1726773085.94076: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set active_profile] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:85 Thursday 19 September 2024 15:11:25 -0400 (0:00:00.333) 0:00:31.673 **** 9733 1726773085.94112: entering _queue_task() for managed_node3/set_fact 9733 1726773085.94329: worker is 1 (out of 1 available) 9733 1726773085.94343: exiting _queue_task() for managed_node3/set_fact 9733 1726773085.94356: done queuing things up, now waiting for results queue to drain 9733 1726773085.94357: waiting for pending results... 
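The slurp payload above, dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK, base64-decodes to "virtual-guest kernel_settings", so the active profile already includes the role's profile name. The Set active_profile task just queued is a set_fact; judging from the variables it resolves below (__cur_profile, __kernel_settings_tuned_profile), a plausible sketch that yields the observed fact is:

    - name: Set active_profile
      ansible.builtin.set_fact:
        __kernel_settings_active_profile: "{{ __cur_profile.content | b64decode | trim }}"
      vars:
        # Assumed wiring: __cur_profile refers to the registered slurp result.
        __cur_profile: "{{ __kernel_settings_tuned_current_profile }}"

The real role logic presumably appends __kernel_settings_tuned_profile only when it is missing; here the decoded content already contains it, so the outcome is the same either way.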
10834 1726773085.94612: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 10834 1726773085.94789: in run() - task 0affffe7-6841-7dd6-8fa6-0000000006e6 10834 1726773085.94809: variable 'ansible_search_path' from source: unknown 10834 1726773085.94813: variable 'ansible_search_path' from source: unknown 10834 1726773085.94844: calling self._execute() 10834 1726773085.94921: variable 'ansible_host' from source: host vars for 'managed_node3' 10834 1726773085.94930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10834 1726773085.94939: variable 'omit' from source: magic vars 10834 1726773085.95016: variable 'omit' from source: magic vars 10834 1726773085.95060: variable 'omit' from source: magic vars 10834 1726773085.95365: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10834 1726773085.95374: variable '__cur_profile' from source: task vars 10834 1726773085.95477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10834 1726773085.97036: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10834 1726773085.97092: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10834 1726773085.97125: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10834 1726773085.97152: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10834 1726773085.97172: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10834 1726773085.97289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10834 1726773085.97320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10834 1726773085.97345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10834 1726773085.97386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10834 1726773085.97405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10834 1726773085.97516: variable '__kernel_settings_tuned_current_profile' from source: set_fact 10834 1726773085.97567: variable 'omit' from source: magic vars 10834 1726773085.97593: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10834 1726773085.97621: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10834 1726773085.97639: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10834 1726773085.97655: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 10834 1726773085.97664: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10834 1726773085.97693: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10834 1726773085.97698: variable 'ansible_host' from source: host vars for 'managed_node3' 10834 1726773085.97704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10834 1726773085.97799: Set connection var ansible_timeout to 10 10834 1726773085.97808: Set connection var ansible_shell_type to sh 10834 1726773085.97815: Set connection var ansible_module_compression to ZIP_DEFLATED 10834 1726773085.97821: Set connection var ansible_shell_executable to /bin/sh 10834 1726773085.97827: Set connection var ansible_pipelining to False 10834 1726773085.97834: Set connection var ansible_connection to ssh 10834 1726773085.97854: variable 'ansible_shell_executable' from source: unknown 10834 1726773085.97859: variable 'ansible_connection' from source: unknown 10834 1726773085.97862: variable 'ansible_module_compression' from source: unknown 10834 1726773085.97865: variable 'ansible_shell_type' from source: unknown 10834 1726773085.97868: variable 'ansible_shell_executable' from source: unknown 10834 1726773085.97871: variable 'ansible_host' from source: host vars for 'managed_node3' 10834 1726773085.97875: variable 'ansible_pipelining' from source: unknown 10834 1726773085.97878: variable 'ansible_timeout' from source: unknown 10834 1726773085.97882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10834 1726773085.97965: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10834 1726773085.97977: variable 'omit' from source: magic vars 10834 1726773085.97982: starting attempt loop 10834 1726773085.97984: running the handler 10834 1726773085.97996: handler run complete 10834 1726773085.98008: attempt loop complete, returning result 10834 1726773085.98012: _execute() done 10834 1726773085.98015: dumping result to json 10834 1726773085.98019: done dumping result, returning 10834 1726773085.98025: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile [0affffe7-6841-7dd6-8fa6-0000000006e6] 10834 1726773085.98031: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006e6 10834 1726773085.98055: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006e6 10834 1726773085.98058: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_active_profile": "virtual-guest kernel_settings" }, "changed": false } 9733 1726773085.98193: no more pending results, returning what we have 9733 1726773085.98196: results queue empty 9733 1726773085.98196: checking for any_errors_fatal 9733 1726773085.98203: done checking for any_errors_fatal 9733 1726773085.98204: checking for max_fail_percentage 9733 1726773085.98205: done checking for max_fail_percentage 9733 1726773085.98206: checking to see if all hosts have failed and the running result is not ok 9733 1726773085.98206: done checking to see if all hosts have failed 9733 1726773085.98207: getting the remaining hosts for this loop 9733 1726773085.98208: done getting the remaining hosts for 
this loop 9733 1726773085.98211: getting the next task for host managed_node3 9733 1726773085.98217: done getting next task for host managed_node3 9733 1726773085.98220: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 9733 1726773085.98224: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773085.98239: getting variables 9733 1726773085.98240: in VariableManager get_vars() 9733 1726773085.98274: Calling all_inventory to load vars for managed_node3 9733 1726773085.98276: Calling groups_inventory to load vars for managed_node3 9733 1726773085.98278: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773085.98290: Calling all_plugins_play to load vars for managed_node3 9733 1726773085.98292: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773085.98295: Calling groups_plugins_play to load vars for managed_node3 9733 1726773085.98418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773085.98552: done with get_vars() 9733 1726773085.98561: done getting variables 9733 1726773085.98605: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91 Thursday 19 September 2024 15:11:25 -0400 (0:00:00.045) 0:00:31.718 **** 9733 1726773085.98630: entering _queue_task() for managed_node3/copy 9733 1726773085.98799: worker is 1 (out of 1 available) 9733 1726773085.98814: exiting _queue_task() for managed_node3/copy 9733 1726773085.98825: done queuing things up, now waiting for results queue to drain 9733 1726773085.98828: waiting for pending results... 
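The Ensure kernel_settings is in active_profile task just queued (task path .../kernel_settings/tasks/main.yml:91) uses the copy action to rewrite /etc/tuned/active_profile from the fact computed above. A minimal sketch, with destination and mode taken from the module arguments that follow; the content expression is an assumption:

    - name: Ensure kernel_settings is in active_profile
      ansible.builtin.copy:
        content: "{{ __kernel_settings_active_profile }}"   # assumed source of the file body
        dest: /etc/tuned/active_profile
        mode: "0600"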
10836 1726773085.98965: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 10836 1726773085.99102: in run() - task 0affffe7-6841-7dd6-8fa6-0000000006e7 10836 1726773085.99120: variable 'ansible_search_path' from source: unknown 10836 1726773085.99124: variable 'ansible_search_path' from source: unknown 10836 1726773085.99150: calling self._execute() 10836 1726773085.99226: variable 'ansible_host' from source: host vars for 'managed_node3' 10836 1726773085.99233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10836 1726773085.99239: variable 'omit' from source: magic vars 10836 1726773085.99313: variable 'omit' from source: magic vars 10836 1726773085.99350: variable 'omit' from source: magic vars 10836 1726773085.99369: variable '__kernel_settings_active_profile' from source: set_fact 10836 1726773085.99602: variable '__kernel_settings_active_profile' from source: set_fact 10836 1726773085.99625: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 10836 1726773085.99677: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 10836 1726773085.99735: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10836 1726773085.99820: variable 'omit' from source: magic vars 10836 1726773085.99854: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10836 1726773085.99880: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10836 1726773085.99899: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10836 1726773085.99913: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10836 1726773085.99924: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10836 1726773085.99948: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10836 1726773085.99953: variable 'ansible_host' from source: host vars for 'managed_node3' 10836 1726773085.99958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10836 1726773086.00027: Set connection var ansible_timeout to 10 10836 1726773086.00032: Set connection var ansible_shell_type to sh 10836 1726773086.00037: Set connection var ansible_module_compression to ZIP_DEFLATED 10836 1726773086.00040: Set connection var ansible_shell_executable to /bin/sh 10836 1726773086.00043: Set connection var ansible_pipelining to False 10836 1726773086.00047: Set connection var ansible_connection to ssh 10836 1726773086.00061: variable 'ansible_shell_executable' from source: unknown 10836 1726773086.00064: variable 'ansible_connection' from source: unknown 10836 1726773086.00066: variable 'ansible_module_compression' from source: unknown 10836 1726773086.00067: variable 'ansible_shell_type' from source: unknown 10836 1726773086.00069: variable 'ansible_shell_executable' from source: unknown 10836 1726773086.00070: variable 'ansible_host' from source: host vars for 'managed_node3' 10836 1726773086.00073: variable 'ansible_pipelining' from source: unknown 10836 1726773086.00074: variable 'ansible_timeout' from source: unknown 10836 1726773086.00076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10836 1726773086.00164: Loading ActionModule 
'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10836 1726773086.00174: variable 'omit' from source: magic vars 10836 1726773086.00178: starting attempt loop 10836 1726773086.00180: running the handler 10836 1726773086.00192: _low_level_execute_command(): starting 10836 1726773086.00201: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10836 1726773086.02603: stdout chunk (state=2): >>>/root <<< 10836 1726773086.02723: stderr chunk (state=3): >>><<< 10836 1726773086.02730: stdout chunk (state=3): >>><<< 10836 1726773086.02749: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10836 1726773086.02762: _low_level_execute_command(): starting 10836 1726773086.02768: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773086.0275621-10836-177124013615584 `" && echo ansible-tmp-1726773086.0275621-10836-177124013615584="` echo /root/.ansible/tmp/ansible-tmp-1726773086.0275621-10836-177124013615584 `" ) && sleep 0' 10836 1726773086.05308: stdout chunk (state=2): >>>ansible-tmp-1726773086.0275621-10836-177124013615584=/root/.ansible/tmp/ansible-tmp-1726773086.0275621-10836-177124013615584 <<< 10836 1726773086.05436: stderr chunk (state=3): >>><<< 10836 1726773086.05442: stdout chunk (state=3): >>><<< 10836 1726773086.05459: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773086.0275621-10836-177124013615584=/root/.ansible/tmp/ansible-tmp-1726773086.0275621-10836-177124013615584 , stderr= 10836 1726773086.05533: variable 'ansible_module_compression' from source: unknown 10836 1726773086.05576: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10836 1726773086.05612: variable 'ansible_facts' from source: unknown 10836 1726773086.05677: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773086.0275621-10836-177124013615584/AnsiballZ_stat.py 10836 1726773086.05766: Sending initial data 10836 1726773086.05773: Sent initial data (152 bytes) 10836 1726773086.08419: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpfpbwhmob /root/.ansible/tmp/ansible-tmp-1726773086.0275621-10836-177124013615584/AnsiballZ_stat.py <<< 10836 1726773086.09590: stderr chunk (state=3): >>><<< 10836 1726773086.09599: stdout chunk (state=3): >>><<< 10836 1726773086.09622: done transferring module to remote 10836 1726773086.09633: _low_level_execute_command(): starting 10836 1726773086.09638: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773086.0275621-10836-177124013615584/ /root/.ansible/tmp/ansible-tmp-1726773086.0275621-10836-177124013615584/AnsiballZ_stat.py && sleep 0' 10836 1726773086.12105: stderr chunk (state=2): >>><<< 10836 1726773086.12117: stdout chunk (state=2): >>><<< 10836 1726773086.12132: _low_level_execute_command() done: rc=0, stdout=, stderr= 10836 1726773086.12136: _low_level_execute_command(): starting 10836 1726773086.12141: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773086.0275621-10836-177124013615584/AnsiballZ_stat.py && sleep 0' 10836 1726773086.28474: stdout chunk 
(state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 142606531, "dev": 51713, "nlink": 1, "atime": 1726773071.2061307, "mtime": 1726773065.4191086, "ctime": 1726773065.4191086, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "2407425296", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 10836 1726773086.29691: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10836 1726773086.29737: stderr chunk (state=3): >>><<< 10836 1726773086.29744: stdout chunk (state=3): >>><<< 10836 1726773086.29759: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 142606531, "dev": 51713, "nlink": 1, "atime": 1726773071.2061307, "mtime": 1726773065.4191086, "ctime": 1726773065.4191086, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "2407425296", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.47.99 closed. 
10836 1726773086.29802: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773086.0275621-10836-177124013615584/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10836 1726773086.29840: variable 'ansible_module_compression' from source: unknown 10836 1726773086.29872: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 10836 1726773086.29895: variable 'ansible_facts' from source: unknown 10836 1726773086.29952: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773086.0275621-10836-177124013615584/AnsiballZ_file.py 10836 1726773086.30058: Sending initial data 10836 1726773086.30065: Sent initial data (152 bytes) 10836 1726773086.33046: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpbd2y_1ry /root/.ansible/tmp/ansible-tmp-1726773086.0275621-10836-177124013615584/AnsiballZ_file.py <<< 10836 1726773086.34992: stderr chunk (state=3): >>><<< 10836 1726773086.35005: stdout chunk (state=3): >>><<< 10836 1726773086.35032: done transferring module to remote 10836 1726773086.35042: _low_level_execute_command(): starting 10836 1726773086.35051: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773086.0275621-10836-177124013615584/ /root/.ansible/tmp/ansible-tmp-1726773086.0275621-10836-177124013615584/AnsiballZ_file.py && sleep 0' 10836 1726773086.37955: stderr chunk (state=2): >>><<< 10836 1726773086.37964: stdout chunk (state=2): >>><<< 10836 1726773086.37980: _low_level_execute_command() done: rc=0, stdout=, stderr= 10836 1726773086.37984: _low_level_execute_command(): starting 10836 1726773086.37991: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773086.0275621-10836-177124013615584/AnsiballZ_file.py && sleep 0' 10836 1726773086.54334: stdout chunk (state=2): >>> {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmp677h1vq9", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10836 1726773086.55512: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. 
<<< 10836 1726773086.55561: stderr chunk (state=3): >>><<< 10836 1726773086.55568: stdout chunk (state=3): >>><<< 10836 1726773086.55586: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmp677h1vq9", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.47.99 closed. 10836 1726773086.55612: done with _execute_module (ansible.legacy.file, {'mode': '0600', 'dest': '/etc/tuned/active_profile', '_original_basename': 'tmp677h1vq9', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773086.0275621-10836-177124013615584/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10836 1726773086.55622: _low_level_execute_command(): starting 10836 1726773086.55626: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773086.0275621-10836-177124013615584/ > /dev/null 2>&1 && sleep 0' 10836 1726773086.58129: stderr chunk (state=2): >>><<< 10836 1726773086.58142: stdout chunk (state=2): >>><<< 10836 1726773086.58158: _low_level_execute_command() done: rc=0, stdout=, stderr= 10836 1726773086.58167: handler run complete 10836 1726773086.58189: attempt loop complete, returning result 10836 1726773086.58193: _execute() done 10836 1726773086.58197: dumping result to json 10836 1726773086.58204: done dumping result, returning 10836 1726773086.58212: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile [0affffe7-6841-7dd6-8fa6-0000000006e7] 10836 1726773086.58218: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006e7 10836 1726773086.58254: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006e7 10836 1726773086.58258: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/active_profile", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "state": "file", "uid": 0 } 9733 1726773086.58426: no more pending results, returning what we have 9733 1726773086.58430: results queue empty 9733 1726773086.58430: checking for any_errors_fatal 9733 1726773086.58436: done checking for 
any_errors_fatal 9733 1726773086.58436: checking for max_fail_percentage 9733 1726773086.58438: done checking for max_fail_percentage 9733 1726773086.58438: checking to see if all hosts have failed and the running result is not ok 9733 1726773086.58439: done checking to see if all hosts have failed 9733 1726773086.58439: getting the remaining hosts for this loop 9733 1726773086.58440: done getting the remaining hosts for this loop 9733 1726773086.58443: getting the next task for host managed_node3 9733 1726773086.58450: done getting next task for host managed_node3 9733 1726773086.58453: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 9733 1726773086.58457: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773086.58466: getting variables 9733 1726773086.58468: in VariableManager get_vars() 9733 1726773086.58504: Calling all_inventory to load vars for managed_node3 9733 1726773086.58506: Calling groups_inventory to load vars for managed_node3 9733 1726773086.58508: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773086.58517: Calling all_plugins_play to load vars for managed_node3 9733 1726773086.58520: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773086.58522: Calling groups_plugins_play to load vars for managed_node3 9733 1726773086.58675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773086.58800: done with get_vars() 9733 1726773086.58809: done getting variables 9733 1726773086.58850: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set profile_mode to manual] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99 Thursday 19 September 2024 15:11:26 -0400 (0:00:00.602) 0:00:32.321 **** 9733 1726773086.58873: entering _queue_task() for managed_node3/copy 9733 1726773086.59043: worker is 1 (out of 1 available) 9733 1726773086.59059: exiting _queue_task() for managed_node3/copy 9733 1726773086.59071: done queuing things up, now waiting for results queue to drain 9733 1726773086.59073: waiting for pending results... 
10860 1726773086.59215: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 10860 1726773086.59347: in run() - task 0affffe7-6841-7dd6-8fa6-0000000006e8 10860 1726773086.59364: variable 'ansible_search_path' from source: unknown 10860 1726773086.59367: variable 'ansible_search_path' from source: unknown 10860 1726773086.59398: calling self._execute() 10860 1726773086.59472: variable 'ansible_host' from source: host vars for 'managed_node3' 10860 1726773086.59480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10860 1726773086.59487: variable 'omit' from source: magic vars 10860 1726773086.59567: variable 'omit' from source: magic vars 10860 1726773086.59613: variable 'omit' from source: magic vars 10860 1726773086.59638: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 10860 1726773086.59868: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 10860 1726773086.59931: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10860 1726773086.59961: variable 'omit' from source: magic vars 10860 1726773086.59995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10860 1726773086.60024: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10860 1726773086.60041: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10860 1726773086.60055: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10860 1726773086.60068: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10860 1726773086.60094: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10860 1726773086.60099: variable 'ansible_host' from source: host vars for 'managed_node3' 10860 1726773086.60106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10860 1726773086.60173: Set connection var ansible_timeout to 10 10860 1726773086.60179: Set connection var ansible_shell_type to sh 10860 1726773086.60186: Set connection var ansible_module_compression to ZIP_DEFLATED 10860 1726773086.60192: Set connection var ansible_shell_executable to /bin/sh 10860 1726773086.60198: Set connection var ansible_pipelining to False 10860 1726773086.60206: Set connection var ansible_connection to ssh 10860 1726773086.60222: variable 'ansible_shell_executable' from source: unknown 10860 1726773086.60226: variable 'ansible_connection' from source: unknown 10860 1726773086.60229: variable 'ansible_module_compression' from source: unknown 10860 1726773086.60232: variable 'ansible_shell_type' from source: unknown 10860 1726773086.60236: variable 'ansible_shell_executable' from source: unknown 10860 1726773086.60239: variable 'ansible_host' from source: host vars for 'managed_node3' 10860 1726773086.60243: variable 'ansible_pipelining' from source: unknown 10860 1726773086.60246: variable 'ansible_timeout' from source: unknown 10860 1726773086.60251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10860 1726773086.60346: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10860 1726773086.60358: variable 'omit' from source: magic vars 10860 1726773086.60364: starting attempt loop 10860 1726773086.60368: running the handler 10860 1726773086.60379: _low_level_execute_command(): starting 10860 1726773086.60388: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10860 1726773086.62783: stdout chunk (state=2): >>>/root <<< 10860 1726773086.62904: stderr chunk (state=3): >>><<< 10860 1726773086.62912: stdout chunk (state=3): >>><<< 10860 1726773086.62931: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10860 1726773086.62944: _low_level_execute_command(): starting 10860 1726773086.62953: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773086.629391-10860-176834505360423 `" && echo ansible-tmp-1726773086.629391-10860-176834505360423="` echo /root/.ansible/tmp/ansible-tmp-1726773086.629391-10860-176834505360423 `" ) && sleep 0' 10860 1726773086.65511: stdout chunk (state=2): >>>ansible-tmp-1726773086.629391-10860-176834505360423=/root/.ansible/tmp/ansible-tmp-1726773086.629391-10860-176834505360423 <<< 10860 1726773086.65646: stderr chunk (state=3): >>><<< 10860 1726773086.65654: stdout chunk (state=3): >>><<< 10860 1726773086.65669: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773086.629391-10860-176834505360423=/root/.ansible/tmp/ansible-tmp-1726773086.629391-10860-176834505360423 , stderr= 10860 1726773086.65746: variable 'ansible_module_compression' from source: unknown 10860 1726773086.65791: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10860 1726773086.65823: variable 'ansible_facts' from source: unknown 10860 1726773086.65893: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773086.629391-10860-176834505360423/AnsiballZ_stat.py 10860 1726773086.65984: Sending initial data 10860 1726773086.65993: Sent initial data (151 bytes) 10860 1726773086.68680: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpisf8gzxh /root/.ansible/tmp/ansible-tmp-1726773086.629391-10860-176834505360423/AnsiballZ_stat.py <<< 10860 1726773086.69866: stderr chunk (state=3): >>><<< 10860 1726773086.69875: stdout chunk (state=3): >>><<< 10860 1726773086.69895: done transferring module to remote 10860 1726773086.69908: _low_level_execute_command(): starting 10860 1726773086.69914: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773086.629391-10860-176834505360423/ /root/.ansible/tmp/ansible-tmp-1726773086.629391-10860-176834505360423/AnsiballZ_stat.py && sleep 0' 10860 1726773086.72693: stderr chunk (state=2): >>><<< 10860 1726773086.72707: stdout chunk (state=2): >>><<< 10860 1726773086.72727: _low_level_execute_command() done: rc=0, stdout=, stderr= 10860 1726773086.72732: _low_level_execute_command(): starting 10860 1726773086.72738: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773086.629391-10860-176834505360423/AnsiballZ_stat.py && sleep 0' 10860 1726773086.89307: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": 
false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 148897989, "dev": 51713, "nlink": 1, "atime": 1726773071.5581322, "mtime": 1726773065.4201086, "ctime": 1726773065.4201086, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "4277482174", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 10860 1726773086.90530: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10860 1726773086.90580: stderr chunk (state=3): >>><<< 10860 1726773086.90588: stdout chunk (state=3): >>><<< 10860 1726773086.90604: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 148897989, "dev": 51713, "nlink": 1, "atime": 1726773071.5581322, "mtime": 1726773065.4201086, "ctime": 1726773065.4201086, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "4277482174", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.47.99 closed. 
10860 1726773086.90649: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773086.629391-10860-176834505360423/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10860 1726773086.90691: variable 'ansible_module_compression' from source: unknown 10860 1726773086.90725: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 10860 1726773086.90746: variable 'ansible_facts' from source: unknown 10860 1726773086.90805: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773086.629391-10860-176834505360423/AnsiballZ_file.py 10860 1726773086.90893: Sending initial data 10860 1726773086.90900: Sent initial data (151 bytes) 10860 1726773086.93560: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpp0gnqc7b /root/.ansible/tmp/ansible-tmp-1726773086.629391-10860-176834505360423/AnsiballZ_file.py <<< 10860 1726773086.94783: stderr chunk (state=3): >>><<< 10860 1726773086.94792: stdout chunk (state=3): >>><<< 10860 1726773086.94811: done transferring module to remote 10860 1726773086.94820: _low_level_execute_command(): starting 10860 1726773086.94825: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773086.629391-10860-176834505360423/ /root/.ansible/tmp/ansible-tmp-1726773086.629391-10860-176834505360423/AnsiballZ_file.py && sleep 0' 10860 1726773086.97280: stderr chunk (state=2): >>><<< 10860 1726773086.97293: stdout chunk (state=2): >>><<< 10860 1726773086.97311: _low_level_execute_command() done: rc=0, stdout=, stderr= 10860 1726773086.97316: _low_level_execute_command(): starting 10860 1726773086.97322: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773086.629391-10860-176834505360423/AnsiballZ_file.py && sleep 0' 10860 1726773087.13614: stdout chunk (state=2): >>> {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmpt1z8zn92", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10860 1726773087.14762: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. 
<<< 10860 1726773087.14772: stdout chunk (state=3): >>><<< 10860 1726773087.14782: stderr chunk (state=3): >>><<< 10860 1726773087.14795: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmpt1z8zn92", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.47.99 closed. 10860 1726773087.14826: done with _execute_module (ansible.legacy.file, {'mode': '0600', 'dest': '/etc/tuned/profile_mode', '_original_basename': 'tmpt1z8zn92', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773086.629391-10860-176834505360423/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10860 1726773087.14838: _low_level_execute_command(): starting 10860 1726773087.14844: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773086.629391-10860-176834505360423/ > /dev/null 2>&1 && sleep 0' 10860 1726773087.17337: stderr chunk (state=2): >>><<< 10860 1726773087.17346: stdout chunk (state=2): >>><<< 10860 1726773087.17360: _low_level_execute_command() done: rc=0, stdout=, stderr= 10860 1726773087.17369: handler run complete 10860 1726773087.17391: attempt loop complete, returning result 10860 1726773087.17395: _execute() done 10860 1726773087.17399: dumping result to json 10860 1726773087.17407: done dumping result, returning 10860 1726773087.17415: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual [0affffe7-6841-7dd6-8fa6-0000000006e8] 10860 1726773087.17421: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006e8 10860 1726773087.17455: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006e8 10860 1726773087.17458: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/profile_mode", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "state": "file", "uid": 0 } 9733 1726773087.17637: no more pending results, returning what we have 9733 1726773087.17641: results queue empty 9733 1726773087.17641: checking for any_errors_fatal 9733 1726773087.17650: done checking for any_errors_fatal 9733 1726773087.17650: 
checking for max_fail_percentage 9733 1726773087.17652: done checking for max_fail_percentage 9733 1726773087.17652: checking to see if all hosts have failed and the running result is not ok 9733 1726773087.17653: done checking to see if all hosts have failed 9733 1726773087.17653: getting the remaining hosts for this loop 9733 1726773087.17654: done getting the remaining hosts for this loop 9733 1726773087.17657: getting the next task for host managed_node3 9733 1726773087.17663: done getting next task for host managed_node3 9733 1726773087.17666: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 9733 1726773087.17669: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773087.17678: getting variables 9733 1726773087.17680: in VariableManager get_vars() 9733 1726773087.17716: Calling all_inventory to load vars for managed_node3 9733 1726773087.17718: Calling groups_inventory to load vars for managed_node3 9733 1726773087.17720: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773087.17729: Calling all_plugins_play to load vars for managed_node3 9733 1726773087.17730: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773087.17732: Calling groups_plugins_play to load vars for managed_node3 9733 1726773087.17843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773087.17973: done with get_vars() 9733 1726773087.17981: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get current config] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107 Thursday 19 September 2024 15:11:27 -0400 (0:00:00.591) 0:00:32.912 **** 9733 1726773087.18042: entering _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config 9733 1726773087.18206: worker is 1 (out of 1 available) 9733 1726773087.18222: exiting _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config 9733 1726773087.18236: done queuing things up, now waiting for results queue to drain 9733 1726773087.18238: waiting for pending results... 
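Note: the next task uses the collection's own module, fedora.linux_system_roles.kernel_settings_get_config, which reads the tuned profile file and returns its sections as a dictionary (see the result further down). A minimal sketch of calling the module directly, using only the module name and the path argument that appear in this log; the register name is invented:

# Hedged sketch of a direct call to the module executed below; only
# 'path' appears in the logged invocation, the register name is made up.
- name: Get current config (illustrative)
  fedora.linux_system_roles.kernel_settings_get_config:
    path: /etc/tuned/kernel_settings/tuned.conf
  register: kernel_settings_current_config   # hypothetical name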
10890 1726773087.18373: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get current config 10890 1726773087.18498: in run() - task 0affffe7-6841-7dd6-8fa6-0000000006e9 10890 1726773087.18516: variable 'ansible_search_path' from source: unknown 10890 1726773087.18520: variable 'ansible_search_path' from source: unknown 10890 1726773087.18561: calling self._execute() 10890 1726773087.18644: variable 'ansible_host' from source: host vars for 'managed_node3' 10890 1726773087.18652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10890 1726773087.18661: variable 'omit' from source: magic vars 10890 1726773087.18753: variable 'omit' from source: magic vars 10890 1726773087.18818: variable 'omit' from source: magic vars 10890 1726773087.18847: variable '__kernel_settings_profile_filename' from source: role '' all vars 10890 1726773087.19156: variable '__kernel_settings_profile_filename' from source: role '' all vars 10890 1726773087.19249: variable '__kernel_settings_profile_dir' from source: role '' all vars 10890 1726773087.19429: variable '__kernel_settings_profile_parent' from source: set_fact 10890 1726773087.19436: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10890 1726773087.19475: variable 'omit' from source: magic vars 10890 1726773087.19517: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10890 1726773087.19544: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10890 1726773087.19561: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10890 1726773087.19576: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10890 1726773087.19589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10890 1726773087.19616: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10890 1726773087.19621: variable 'ansible_host' from source: host vars for 'managed_node3' 10890 1726773087.19626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10890 1726773087.19698: Set connection var ansible_timeout to 10 10890 1726773087.19705: Set connection var ansible_shell_type to sh 10890 1726773087.19712: Set connection var ansible_module_compression to ZIP_DEFLATED 10890 1726773087.19717: Set connection var ansible_shell_executable to /bin/sh 10890 1726773087.19723: Set connection var ansible_pipelining to False 10890 1726773087.19728: Set connection var ansible_connection to ssh 10890 1726773087.19742: variable 'ansible_shell_executable' from source: unknown 10890 1726773087.19744: variable 'ansible_connection' from source: unknown 10890 1726773087.19746: variable 'ansible_module_compression' from source: unknown 10890 1726773087.19747: variable 'ansible_shell_type' from source: unknown 10890 1726773087.19749: variable 'ansible_shell_executable' from source: unknown 10890 1726773087.19751: variable 'ansible_host' from source: host vars for 'managed_node3' 10890 1726773087.19753: variable 'ansible_pipelining' from source: unknown 10890 1726773087.19754: variable 'ansible_timeout' from source: unknown 10890 1726773087.19756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10890 1726773087.19889: Loading ActionModule 'normal' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10890 1726773087.19898: variable 'omit' from source: magic vars 10890 1726773087.19905: starting attempt loop 10890 1726773087.19908: running the handler 10890 1726773087.19917: _low_level_execute_command(): starting 10890 1726773087.19923: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10890 1726773087.22352: stdout chunk (state=2): >>>/root <<< 10890 1726773087.22512: stderr chunk (state=3): >>><<< 10890 1726773087.22519: stdout chunk (state=3): >>><<< 10890 1726773087.22538: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10890 1726773087.22551: _low_level_execute_command(): starting 10890 1726773087.22557: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773087.2254517-10890-215413231042091 `" && echo ansible-tmp-1726773087.2254517-10890-215413231042091="` echo /root/.ansible/tmp/ansible-tmp-1726773087.2254517-10890-215413231042091 `" ) && sleep 0' 10890 1726773087.25191: stdout chunk (state=2): >>>ansible-tmp-1726773087.2254517-10890-215413231042091=/root/.ansible/tmp/ansible-tmp-1726773087.2254517-10890-215413231042091 <<< 10890 1726773087.25316: stderr chunk (state=3): >>><<< 10890 1726773087.25323: stdout chunk (state=3): >>><<< 10890 1726773087.25337: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773087.2254517-10890-215413231042091=/root/.ansible/tmp/ansible-tmp-1726773087.2254517-10890-215413231042091 , stderr= 10890 1726773087.25373: variable 'ansible_module_compression' from source: unknown 10890 1726773087.25407: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 10890 1726773087.25437: variable 'ansible_facts' from source: unknown 10890 1726773087.25503: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773087.2254517-10890-215413231042091/AnsiballZ_kernel_settings_get_config.py 10890 1726773087.25600: Sending initial data 10890 1726773087.25607: Sent initial data (174 bytes) 10890 1726773087.28122: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpb0eeau8k /root/.ansible/tmp/ansible-tmp-1726773087.2254517-10890-215413231042091/AnsiballZ_kernel_settings_get_config.py <<< 10890 1726773087.29310: stderr chunk (state=3): >>><<< 10890 1726773087.29318: stdout chunk (state=3): >>><<< 10890 1726773087.29336: done transferring module to remote 10890 1726773087.29346: _low_level_execute_command(): starting 10890 1726773087.29351: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773087.2254517-10890-215413231042091/ /root/.ansible/tmp/ansible-tmp-1726773087.2254517-10890-215413231042091/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10890 1726773087.31736: stderr chunk (state=2): >>><<< 10890 1726773087.31746: stdout chunk (state=2): >>><<< 10890 1726773087.31760: _low_level_execute_command() done: rc=0, stdout=, stderr= 10890 1726773087.31764: _low_level_execute_command(): starting 10890 1726773087.31769: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python 
/root/.ansible/tmp/ansible-tmp-1726773087.2254517-10890-215413231042091/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10890 1726773087.47133: stdout chunk (state=2): >>> {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "fs.file-max": "379724"}, "sysfs": {"/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0"}, "vm": {"transparent_hugepages": "madvise"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} <<< 10890 1726773087.48120: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10890 1726773087.48163: stderr chunk (state=3): >>><<< 10890 1726773087.48169: stdout chunk (state=3): >>><<< 10890 1726773087.48188: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "fs.file-max": "379724"}, "sysfs": {"/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0"}, "vm": {"transparent_hugepages": "madvise"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} , stderr=Shared connection to 10.31.47.99 closed. 10890 1726773087.48216: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773087.2254517-10890-215413231042091/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10890 1726773087.48227: _low_level_execute_command(): starting 10890 1726773087.48234: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773087.2254517-10890-215413231042091/ > /dev/null 2>&1 && sleep 0' 10890 1726773087.50675: stderr chunk (state=2): >>><<< 10890 1726773087.50687: stdout chunk (state=2): >>><<< 10890 1726773087.50703: _low_level_execute_command() done: rc=0, stdout=, stderr= 10890 1726773087.50711: handler run complete 10890 1726773087.50726: attempt loop complete, returning result 10890 1726773087.50729: _execute() done 10890 1726773087.50733: dumping result to json 10890 1726773087.50738: done dumping result, returning 10890 1726773087.50745: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get current config [0affffe7-6841-7dd6-8fa6-0000000006e9] 10890 1726773087.50751: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006e9 10890 1726773087.50780: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006e9 10890 1726773087.50784: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "data": { "main": { "summary": "kernel settings" }, "sysctl": { "fs.epoll.max_user_watches": "785592", "fs.file-max": "379724" }, "sysfs": { "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0" }, "vm": { 
"transparent_hugepages": "madvise" } } } 9733 1726773087.50965: no more pending results, returning what we have 9733 1726773087.50970: results queue empty 9733 1726773087.50971: checking for any_errors_fatal 9733 1726773087.50978: done checking for any_errors_fatal 9733 1726773087.50978: checking for max_fail_percentage 9733 1726773087.50980: done checking for max_fail_percentage 9733 1726773087.50981: checking to see if all hosts have failed and the running result is not ok 9733 1726773087.50981: done checking to see if all hosts have failed 9733 1726773087.50982: getting the remaining hosts for this loop 9733 1726773087.50983: done getting the remaining hosts for this loop 9733 1726773087.50987: getting the next task for host managed_node3 9733 1726773087.50994: done getting next task for host managed_node3 9733 1726773087.50997: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 9733 1726773087.51001: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 9733 1726773087.51011: getting variables 9733 1726773087.51012: in VariableManager get_vars() 9733 1726773087.51046: Calling all_inventory to load vars for managed_node3 9733 1726773087.51048: Calling groups_inventory to load vars for managed_node3 9733 1726773087.51049: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773087.51057: Calling all_plugins_play to load vars for managed_node3 9733 1726773087.51059: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773087.51060: Calling groups_plugins_play to load vars for managed_node3 9733 1726773087.51216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773087.51336: done with get_vars() 9733 1726773087.51344: done getting variables 9733 1726773087.51388: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Apply kernel settings] ******* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 Thursday 19 September 2024 15:11:27 -0400 (0:00:00.333) 0:00:33.246 **** 9733 1726773087.51414: entering _queue_task() for managed_node3/template 9733 1726773087.51581: worker is 1 (out of 1 available) 9733 1726773087.51599: exiting _queue_task() for managed_node3/template 9733 1726773087.51612: done queuing things up, now waiting for results queue to drain 9733 1726773087.51614: waiting for pending results... 10906 1726773087.51750: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 10906 1726773087.51875: in run() - task 0affffe7-6841-7dd6-8fa6-0000000006ea 10906 1726773087.51892: variable 'ansible_search_path' from source: unknown 10906 1726773087.51897: variable 'ansible_search_path' from source: unknown 10906 1726773087.51927: calling self._execute() 10906 1726773087.52012: variable 'ansible_host' from source: host vars for 'managed_node3' 10906 1726773087.52021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10906 1726773087.52030: variable 'omit' from source: magic vars 10906 1726773087.52131: variable 'omit' from source: magic vars 10906 1726773087.52183: variable 'omit' from source: magic vars 10906 1726773087.52504: variable '__kernel_settings_profile_src' from source: role '' all vars 10906 1726773087.52514: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10906 1726773087.52589: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10906 1726773087.52616: variable '__kernel_settings_profile_filename' from source: role '' all vars 10906 1726773087.52680: variable '__kernel_settings_profile_filename' from source: role '' all vars 10906 1726773087.52752: variable '__kernel_settings_profile_dir' from source: role '' all vars 10906 1726773087.52845: variable '__kernel_settings_profile_parent' from source: set_fact 10906 1726773087.52853: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10906 1726773087.52882: variable 'omit' from source: magic vars 10906 1726773087.52931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10906 1726773087.52965: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10906 1726773087.52988: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10906 1726773087.53007: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10906 1726773087.53024: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10906 1726773087.53055: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10906 1726773087.53063: variable 'ansible_host' from source: host vars for 'managed_node3' 10906 1726773087.53067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10906 1726773087.53170: Set connection var ansible_timeout to 10 10906 1726773087.53176: Set connection var ansible_shell_type to sh 10906 1726773087.53184: Set connection var ansible_module_compression to ZIP_DEFLATED 10906 1726773087.53191: Set connection var ansible_shell_executable to /bin/sh 10906 1726773087.53197: Set connection var ansible_pipelining to False 10906 1726773087.53205: Set connection var ansible_connection to ssh 10906 1726773087.53224: variable 'ansible_shell_executable' from source: unknown 10906 1726773087.53228: variable 'ansible_connection' from source: unknown 10906 1726773087.53230: variable 'ansible_module_compression' from source: unknown 10906 1726773087.53232: variable 'ansible_shell_type' from source: unknown 10906 1726773087.53234: variable 'ansible_shell_executable' from source: unknown 10906 1726773087.53235: variable 'ansible_host' from source: host vars for 'managed_node3' 10906 1726773087.53237: variable 'ansible_pipelining' from source: unknown 10906 1726773087.53239: variable 'ansible_timeout' from source: unknown 10906 1726773087.53243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10906 1726773087.53372: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10906 1726773087.53384: variable 'omit' from source: magic vars 10906 1726773087.53392: starting attempt loop 10906 1726773087.53396: running the handler 10906 1726773087.53408: _low_level_execute_command(): starting 10906 1726773087.53417: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10906 1726773087.55943: stdout chunk (state=2): >>>/root <<< 10906 1726773087.56062: stderr chunk (state=3): >>><<< 10906 1726773087.56069: stdout chunk (state=3): >>><<< 10906 1726773087.56090: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10906 1726773087.56105: _low_level_execute_command(): starting 10906 1726773087.56117: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828 `" && echo ansible-tmp-1726773087.5609813-10906-67374685302828="` echo /root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828 `" ) && sleep 0' 10906 1726773087.58832: stdout chunk (state=2): >>>ansible-tmp-1726773087.5609813-10906-67374685302828=/root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828 <<< 10906 1726773087.58962: stderr chunk 
(state=3): >>><<< 10906 1726773087.58970: stdout chunk (state=3): >>><<< 10906 1726773087.58988: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773087.5609813-10906-67374685302828=/root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828 , stderr= 10906 1726773087.59007: evaluation_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks 10906 1726773087.59026: search_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2 10906 1726773087.59050: variable 'ansible_search_path' from source: unknown 10906 1726773087.59651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10906 1726773087.61117: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10906 1726773087.61174: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10906 1726773087.61210: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10906 1726773087.61237: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10906 1726773087.61258: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10906 1726773087.61446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10906 1726773087.61468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10906 1726773087.61491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10906 1726773087.61521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10906 1726773087.61533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10906 1726773087.61759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10906 1726773087.61777: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10906 1726773087.61796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10906 1726773087.61824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10906 1726773087.61835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10906 1726773087.62079: variable 'ansible_managed' from source: unknown 10906 1726773087.62089: variable '__sections' from source: task vars 10906 1726773087.62177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10906 1726773087.62198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10906 1726773087.62218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10906 1726773087.62244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10906 1726773087.62255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10906 1726773087.62325: variable 'kernel_settings_sysctl' from source: include params 10906 1726773087.62332: variable '__kernel_settings_state_empty' from source: role '' all vars 10906 1726773087.62339: variable '__kernel_settings_previous_replaced' from source: role '' all vars 10906 1726773087.62367: variable '__sysctl_old' from source: task vars 10906 1726773087.62416: variable '__sysctl_old' from source: task vars 10906 1726773087.62555: variable 'kernel_settings_purge' from source: include params 10906 1726773087.62562: variable 'kernel_settings_sysctl' from source: include params 10906 1726773087.62567: variable '__kernel_settings_state_empty' from source: role '' all vars 10906 1726773087.62573: variable '__kernel_settings_previous_replaced' from source: role '' all vars 10906 1726773087.62577: variable '__kernel_settings_profile_contents' from source: set_fact 10906 1726773087.62708: variable 'kernel_settings_sysfs' from source: include params 10906 1726773087.62715: variable '__kernel_settings_state_empty' from source: role '' all vars 10906 1726773087.62721: variable '__kernel_settings_previous_replaced' from source: role '' all vars 10906 1726773087.62734: variable '__sysfs_old' from source: task vars 10906 1726773087.62776: variable '__sysfs_old' from source: task vars 10906 1726773087.62915: variable 'kernel_settings_purge' from source: include params 10906 1726773087.62922: variable 
'kernel_settings_sysfs' from source: include params 10906 1726773087.62927: variable '__kernel_settings_state_empty' from source: role '' all vars 10906 1726773087.62932: variable '__kernel_settings_previous_replaced' from source: role '' all vars 10906 1726773087.62937: variable '__kernel_settings_profile_contents' from source: set_fact 10906 1726773087.62952: variable 'kernel_settings_systemd_cpu_affinity' from source: include params 10906 1726773087.62960: variable '__systemd_old' from source: task vars 10906 1726773087.63007: variable '__systemd_old' from source: task vars 10906 1726773087.63138: variable 'kernel_settings_purge' from source: include params 10906 1726773087.63144: variable 'kernel_settings_systemd_cpu_affinity' from source: include params 10906 1726773087.63149: variable '__kernel_settings_state_absent' from source: role '' all vars 10906 1726773087.63155: variable '__kernel_settings_profile_contents' from source: set_fact 10906 1726773087.63166: variable 'kernel_settings_transparent_hugepages' from source: include params 10906 1726773087.63171: variable 'kernel_settings_transparent_hugepages_defrag' from source: include params 10906 1726773087.63176: variable '__trans_huge_old' from source: task vars 10906 1726773087.63220: variable '__trans_huge_old' from source: task vars 10906 1726773087.63350: variable 'kernel_settings_purge' from source: include params 10906 1726773087.63357: variable 'kernel_settings_transparent_hugepages' from source: include params 10906 1726773087.63362: variable '__kernel_settings_state_absent' from source: role '' all vars 10906 1726773087.63368: variable '__kernel_settings_profile_contents' from source: set_fact 10906 1726773087.63378: variable '__trans_defrag_old' from source: task vars 10906 1726773087.63423: variable '__trans_defrag_old' from source: task vars 10906 1726773087.63551: variable 'kernel_settings_purge' from source: include params 10906 1726773087.63557: variable 'kernel_settings_transparent_hugepages_defrag' from source: include params 10906 1726773087.63562: variable '__kernel_settings_state_absent' from source: role '' all vars 10906 1726773087.63567: variable '__kernel_settings_profile_contents' from source: set_fact 10906 1726773087.63587: variable '__kernel_settings_state_absent' from source: role '' all vars 10906 1726773087.63598: variable '__kernel_settings_state_absent' from source: role '' all vars 10906 1726773087.63607: variable '__kernel_settings_state_absent' from source: role '' all vars 10906 1726773087.64223: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10906 1726773087.64265: variable 'ansible_module_compression' from source: unknown 10906 1726773087.64309: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10906 1726773087.64329: variable 'ansible_facts' from source: unknown 10906 1726773087.64390: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828/AnsiballZ_stat.py 10906 1726773087.64481: Sending initial data 10906 1726773087.64490: Sent initial data (151 bytes) 10906 1726773087.67134: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpkn8wkye2 
/root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828/AnsiballZ_stat.py <<< 10906 1726773087.68364: stderr chunk (state=3): >>><<< 10906 1726773087.68374: stdout chunk (state=3): >>><<< 10906 1726773087.68395: done transferring module to remote 10906 1726773087.68407: _low_level_execute_command(): starting 10906 1726773087.68413: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828/ /root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828/AnsiballZ_stat.py && sleep 0' 10906 1726773087.70805: stderr chunk (state=2): >>><<< 10906 1726773087.70815: stdout chunk (state=2): >>><<< 10906 1726773087.70832: _low_level_execute_command() done: rc=0, stdout=, stderr= 10906 1726773087.70836: _low_level_execute_command(): starting 10906 1726773087.70842: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828/AnsiballZ_stat.py && sleep 0' 10906 1726773087.86632: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 312, "inode": 159383749, "dev": 51713, "nlink": 1, "atime": 1726773065.3961084, "mtime": 1726773064.4551048, "ctime": 1726773064.7141058, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "ba15904bb90578344fad097ce2f46f9231275eae", "mimetype": "text/plain", "charset": "us-ascii", "version": "1834219966", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 10906 1726773087.87724: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10906 1726773087.87776: stderr chunk (state=3): >>><<< 10906 1726773087.87783: stdout chunk (state=3): >>><<< 10906 1726773087.87802: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 312, "inode": 159383749, "dev": 51713, "nlink": 1, "atime": 1726773065.3961084, "mtime": 1726773064.4551048, "ctime": 1726773064.7141058, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "ba15904bb90578344fad097ce2f46f9231275eae", "mimetype": "text/plain", "charset": "us-ascii", "version": "1834219966", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.47.99 closed. 
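The stat result above is the first half of Ansible's copy/template action flow: the action plugin stats the remote destination and compares its SHA-1 checksum with the locally rendered file before deciding whether to transfer anything. A minimal stand-alone equivalent is sketched below; it is purely illustrative (the hardcoded checksum is simply the new file's SHA-1 taken from this log), not the role's actual code.

- name: Check current tuned.conf before deciding to copy (illustrative only)
  ansible.builtin.stat:
    path: /etc/tuned/kernel_settings/tuned.conf
    checksum_algorithm: sha1
  register: __dest_stat

- name: Report whether an update would be needed (illustrative only)
  ansible.builtin.debug:
    msg: "update needed: {{ not __dest_stat.stat.exists or __dest_stat.stat.checksum != 'e44ba7fc7046252a1b6772f7347d0e7b9b48a069' }}"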
10906 1726773087.87841: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10906 1726773087.87931: Sending initial data 10906 1726773087.87939: Sent initial data (159 bytes) 10906 1726773087.90523: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpemgyp9lv/kernel_settings.j2 /root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828/source <<< 10906 1726773087.90959: stderr chunk (state=3): >>><<< 10906 1726773087.90968: stdout chunk (state=3): >>><<< 10906 1726773087.90984: _low_level_execute_command(): starting 10906 1726773087.90991: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828/ /root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828/source && sleep 0' 10906 1726773087.93370: stderr chunk (state=2): >>><<< 10906 1726773087.93381: stdout chunk (state=2): >>><<< 10906 1726773087.93399: _low_level_execute_command() done: rc=0, stdout=, stderr= 10906 1726773087.93423: variable 'ansible_module_compression' from source: unknown 10906 1726773087.93458: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 10906 1726773087.93480: variable 'ansible_facts' from source: unknown 10906 1726773087.93537: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828/AnsiballZ_copy.py 10906 1726773087.93633: Sending initial data 10906 1726773087.93640: Sent initial data (151 bytes) 10906 1726773087.96177: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpfsd6f_0s /root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828/AnsiballZ_copy.py <<< 10906 1726773087.97712: stderr chunk (state=3): >>><<< 10906 1726773087.97727: stdout chunk (state=3): >>><<< 10906 1726773087.97748: done transferring module to remote 10906 1726773087.97757: _low_level_execute_command(): starting 10906 1726773087.97767: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828/ /root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828/AnsiballZ_copy.py && sleep 0' 10906 1726773088.00178: stderr chunk (state=2): >>><<< 10906 1726773088.00191: stdout chunk (state=2): >>><<< 10906 1726773088.00210: _low_level_execute_command() done: rc=0, stdout=, stderr= 10906 1726773088.00215: _low_level_execute_command(): starting 10906 1726773088.00221: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828/AnsiballZ_copy.py && sleep 0' 10906 1726773088.17024: stdout chunk (state=2): >>> {"dest": "/etc/tuned/kernel_settings/tuned.conf", 
"src": "/root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828/source", "md5sum": "7d83891795eeb6debeff7e2812501630", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10906 1726773088.18168: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10906 1726773088.18217: stderr chunk (state=3): >>><<< 10906 1726773088.18224: stdout chunk (state=3): >>><<< 10906 1726773088.18240: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828/source", "md5sum": "7d83891795eeb6debeff7e2812501630", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.47.99 closed. 
10906 1726773088.18268: done with _execute_module (ansible.legacy.copy, {'src': '/root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828/source', 'dest': '/etc/tuned/kernel_settings/tuned.conf', 'mode': '0644', 'follow': False, '_original_basename': 'kernel_settings.j2', 'checksum': 'e44ba7fc7046252a1b6772f7347d0e7b9b48a069', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10906 1726773088.18297: _low_level_execute_command(): starting 10906 1726773088.18305: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828/ > /dev/null 2>&1 && sleep 0' 10906 1726773088.20735: stderr chunk (state=2): >>><<< 10906 1726773088.20742: stdout chunk (state=2): >>><<< 10906 1726773088.20755: _low_level_execute_command() done: rc=0, stdout=, stderr= 10906 1726773088.20765: handler run complete 10906 1726773088.20784: attempt loop complete, returning result 10906 1726773088.20790: _execute() done 10906 1726773088.20794: dumping result to json 10906 1726773088.20800: done dumping result, returning 10906 1726773088.20811: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings [0affffe7-6841-7dd6-8fa6-0000000006ea] 10906 1726773088.20817: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006ea 10906 1726773088.20862: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006ea 10906 1726773088.20866: WORKER PROCESS EXITING changed: [managed_node3] => { "changed": true, "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "dest": "/etc/tuned/kernel_settings/tuned.conf", "gid": 0, "group": "root", "md5sum": "7d83891795eeb6debeff7e2812501630", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "src": "/root/.ansible/tmp/ansible-tmp-1726773087.5609813-10906-67374685302828/source", "state": "file", "uid": 0 } 9733 1726773088.21049: no more pending results, returning what we have 9733 1726773088.21052: results queue empty 9733 1726773088.21053: checking for any_errors_fatal 9733 1726773088.21060: done checking for any_errors_fatal 9733 1726773088.21061: checking for max_fail_percentage 9733 1726773088.21062: done checking for max_fail_percentage 9733 1726773088.21063: checking to see if all hosts have failed and the running result is not ok 9733 1726773088.21063: done checking to see if all hosts have failed 9733 1726773088.21064: getting the remaining hosts for this loop 9733 1726773088.21065: done getting the remaining hosts for this loop 9733 1726773088.21068: getting the next task for host managed_node3 9733 1726773088.21075: done getting next task for host managed_node3 9733 1726773088.21078: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 9733 1726773088.21081: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773088.21095: getting variables 9733 1726773088.21096: in VariableManager get_vars() 9733 1726773088.21129: Calling all_inventory to load vars for managed_node3 9733 1726773088.21132: Calling groups_inventory to load vars for managed_node3 9733 1726773088.21133: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773088.21143: Calling all_plugins_play to load vars for managed_node3 9733 1726773088.21145: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773088.21148: Calling groups_plugins_play to load vars for managed_node3 9733 1726773088.21261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773088.21390: done with get_vars() 9733 1726773088.21400: done getting variables 9733 1726773088.21443: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149 Thursday 19 September 2024 15:11:28 -0400 (0:00:00.700) 0:00:33.947 **** 9733 1726773088.21469: entering _queue_task() for managed_node3/service 9733 1726773088.21638: worker is 1 (out of 1 available) 9733 1726773088.21655: exiting _queue_task() for managed_node3/service 9733 1726773088.21668: done queuing things up, now waiting for results queue to drain 9733 1726773088.21670: waiting for pending results... 
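The task queued here (tasks/main.yml:149) restarts tuned only when the profile file or the daemon mode actually changed; as the entries that follow show, the conditional evaluates to False for this run and the single item "tuned" is skipped. A hedged sketch of such a task, with the loop source and condition taken from the log and the rest approximate:

- name: Restart tuned to apply active profile, mode changes
  ansible.builtin.service:
    name: "{{ item }}"
    state: restarted
  loop: "{{ __kernel_settings_services }}"
  when: __kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed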
10938 1726773088.21809: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 10938 1726773088.21942: in run() - task 0affffe7-6841-7dd6-8fa6-0000000006eb 10938 1726773088.21959: variable 'ansible_search_path' from source: unknown 10938 1726773088.21962: variable 'ansible_search_path' from source: unknown 10938 1726773088.22002: variable '__kernel_settings_services' from source: include_vars 10938 1726773088.22331: variable '__kernel_settings_services' from source: include_vars 10938 1726773088.22392: variable 'omit' from source: magic vars 10938 1726773088.22468: variable 'ansible_host' from source: host vars for 'managed_node3' 10938 1726773088.22479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10938 1726773088.22490: variable 'omit' from source: magic vars 10938 1726773088.22678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10938 1726773088.22857: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10938 1726773088.22894: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10938 1726773088.22921: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10938 1726773088.22947: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10938 1726773088.23025: variable '__kernel_settings_register_profile' from source: set_fact 10938 1726773088.23036: variable '__kernel_settings_register_mode' from source: set_fact 10938 1726773088.23053: Evaluated conditional (__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed): False 10938 1726773088.23057: when evaluation is False, skipping this task 10938 1726773088.23078: variable 'item' from source: unknown 10938 1726773088.23130: variable 'item' from source: unknown skipping: [managed_node3] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed", "item": "tuned", "skip_reason": "Conditional result was False" } 10938 1726773088.23159: dumping result to json 10938 1726773088.23164: done dumping result, returning 10938 1726773088.23171: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes [0affffe7-6841-7dd6-8fa6-0000000006eb] 10938 1726773088.23177: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006eb 10938 1726773088.23204: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006eb 10938 1726773088.23208: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false } MSG: All items skipped 9733 1726773088.23369: no more pending results, returning what we have 9733 1726773088.23372: results queue empty 9733 1726773088.23372: checking for any_errors_fatal 9733 1726773088.23387: done checking for any_errors_fatal 9733 1726773088.23388: checking for max_fail_percentage 9733 1726773088.23389: done checking for max_fail_percentage 9733 1726773088.23389: checking to see if all hosts have failed and the running result is not ok 9733 1726773088.23390: done checking to see if all hosts have failed 9733 1726773088.23391: getting the remaining hosts for this loop 9733 1726773088.23392: done getting the remaining 
hosts for this loop 9733 1726773088.23394: getting the next task for host managed_node3 9733 1726773088.23400: done getting next task for host managed_node3 9733 1726773088.23403: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 9733 1726773088.23406: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773088.23420: getting variables 9733 1726773088.23421: in VariableManager get_vars() 9733 1726773088.23454: Calling all_inventory to load vars for managed_node3 9733 1726773088.23456: Calling groups_inventory to load vars for managed_node3 9733 1726773088.23458: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773088.23465: Calling all_plugins_play to load vars for managed_node3 9733 1726773088.23467: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773088.23468: Calling groups_plugins_play to load vars for managed_node3 9733 1726773088.23577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773088.23702: done with get_vars() 9733 1726773088.23711: done getting variables 9733 1726773088.23751: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Tuned apply settings] ******** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 Thursday 19 September 2024 15:11:28 -0400 (0:00:00.023) 0:00:33.970 **** 9733 1726773088.23774: entering _queue_task() for managed_node3/command 9733 1726773088.23946: worker is 1 (out of 1 available) 9733 1726773088.23962: exiting _queue_task() for managed_node3/command 9733 1726773088.23976: done queuing things up, now waiting for results queue to drain 9733 1726773088.23978: waiting for pending results... 
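The "Tuned apply settings" task (tasks/main.yml:157) is the fallback path: when tuned was not restarted but the profile activation step still reported a change, it re-applies the profile by calling tuned-adm directly. The three conditionals and the resulting command string appear verbatim in the entries that follow; the task body below is an approximate reconstruction, not the role's verbatim source.

- name: Tuned apply settings
  ansible.builtin.command: tuned-adm profile '{{ __kernel_settings_active_profile }}'
  when:
    - not __kernel_settings_register_profile is changed
    - not __kernel_settings_register_mode is changed
    - __kernel_settings_register_apply is changed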
10939 1726773088.24116: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 10939 1726773088.24245: in run() - task 0affffe7-6841-7dd6-8fa6-0000000006ec 10939 1726773088.24261: variable 'ansible_search_path' from source: unknown 10939 1726773088.24265: variable 'ansible_search_path' from source: unknown 10939 1726773088.24295: calling self._execute() 10939 1726773088.24369: variable 'ansible_host' from source: host vars for 'managed_node3' 10939 1726773088.24379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10939 1726773088.24389: variable 'omit' from source: magic vars 10939 1726773088.24730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10939 1726773088.25025: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10939 1726773088.25058: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10939 1726773088.25087: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10939 1726773088.25116: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10939 1726773088.25203: variable '__kernel_settings_register_profile' from source: set_fact 10939 1726773088.25225: Evaluated conditional (not __kernel_settings_register_profile is changed): True 10939 1726773088.25318: variable '__kernel_settings_register_mode' from source: set_fact 10939 1726773088.25330: Evaluated conditional (not __kernel_settings_register_mode is changed): True 10939 1726773088.25408: variable '__kernel_settings_register_apply' from source: set_fact 10939 1726773088.25419: Evaluated conditional (__kernel_settings_register_apply is changed): True 10939 1726773088.25426: variable 'omit' from source: magic vars 10939 1726773088.25461: variable 'omit' from source: magic vars 10939 1726773088.25547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10939 1726773088.27006: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10939 1726773088.27061: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10939 1726773088.27099: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10939 1726773088.27125: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10939 1726773088.27149: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10939 1726773088.27215: variable '__kernel_settings_active_profile' from source: set_fact 10939 1726773088.27247: variable 'omit' from source: magic vars 10939 1726773088.27274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10939 1726773088.27298: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10939 1726773088.27320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10939 1726773088.27334: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10939 1726773088.27344: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10939 1726773088.27367: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10939 1726773088.27372: variable 'ansible_host' from source: host vars for 'managed_node3' 10939 1726773088.27376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10939 1726773088.27451: Set connection var ansible_timeout to 10 10939 1726773088.27456: Set connection var ansible_shell_type to sh 10939 1726773088.27461: Set connection var ansible_module_compression to ZIP_DEFLATED 10939 1726773088.27464: Set connection var ansible_shell_executable to /bin/sh 10939 1726773088.27468: Set connection var ansible_pipelining to False 10939 1726773088.27472: Set connection var ansible_connection to ssh 10939 1726773088.27491: variable 'ansible_shell_executable' from source: unknown 10939 1726773088.27494: variable 'ansible_connection' from source: unknown 10939 1726773088.27496: variable 'ansible_module_compression' from source: unknown 10939 1726773088.27498: variable 'ansible_shell_type' from source: unknown 10939 1726773088.27502: variable 'ansible_shell_executable' from source: unknown 10939 1726773088.27504: variable 'ansible_host' from source: host vars for 'managed_node3' 10939 1726773088.27506: variable 'ansible_pipelining' from source: unknown 10939 1726773088.27507: variable 'ansible_timeout' from source: unknown 10939 1726773088.27510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10939 1726773088.27572: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10939 1726773088.27584: variable 'omit' from source: magic vars 10939 1726773088.27593: starting attempt loop 10939 1726773088.27596: running the handler 10939 1726773088.27608: _low_level_execute_command(): starting 10939 1726773088.27612: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10939 1726773088.30016: stdout chunk (state=2): >>>/root <<< 10939 1726773088.30134: stderr chunk (state=3): >>><<< 10939 1726773088.30142: stdout chunk (state=3): >>><<< 10939 1726773088.30161: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10939 1726773088.30173: _low_level_execute_command(): starting 10939 1726773088.30180: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773088.3016956-10939-171798909346989 `" && echo ansible-tmp-1726773088.3016956-10939-171798909346989="` echo /root/.ansible/tmp/ansible-tmp-1726773088.3016956-10939-171798909346989 `" ) && sleep 0' 10939 1726773088.32696: stdout chunk (state=2): >>>ansible-tmp-1726773088.3016956-10939-171798909346989=/root/.ansible/tmp/ansible-tmp-1726773088.3016956-10939-171798909346989 <<< 10939 1726773088.32825: stderr chunk (state=3): >>><<< 10939 1726773088.32833: stdout chunk (state=3): >>><<< 10939 1726773088.32849: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773088.3016956-10939-171798909346989=/root/.ansible/tmp/ansible-tmp-1726773088.3016956-10939-171798909346989 , stderr= 10939 1726773088.32877: variable 'ansible_module_compression' from source: unknown 10939 1726773088.32919: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10939 1726773088.32948: variable 'ansible_facts' from source: unknown 10939 1726773088.33026: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773088.3016956-10939-171798909346989/AnsiballZ_command.py 10939 1726773088.33126: Sending initial data 10939 1726773088.33133: Sent initial data (155 bytes) 10939 1726773088.36119: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpwa1w2nf0 /root/.ansible/tmp/ansible-tmp-1726773088.3016956-10939-171798909346989/AnsiballZ_command.py <<< 10939 1726773088.37295: stderr chunk (state=3): >>><<< 10939 1726773088.37308: stdout chunk (state=3): >>><<< 10939 1726773088.37328: done transferring module to remote 10939 1726773088.37341: _low_level_execute_command(): starting 10939 1726773088.37347: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773088.3016956-10939-171798909346989/ /root/.ansible/tmp/ansible-tmp-1726773088.3016956-10939-171798909346989/AnsiballZ_command.py && sleep 0' 10939 1726773088.39709: stderr chunk (state=2): >>><<< 10939 1726773088.39720: stdout chunk (state=2): >>><<< 10939 1726773088.39735: _low_level_execute_command() done: rc=0, stdout=, stderr= 10939 1726773088.39740: _low_level_execute_command(): starting 10939 1726773088.39746: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773088.3016956-10939-171798909346989/AnsiballZ_command.py && sleep 0' 10939 1726773089.69544: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "start": "2024-09-19 15:11:28.545371", "end": "2024-09-19 15:11:29.690177", "delta": "0:00:01.144806", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10939 1726773089.70489: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10939 1726773089.70538: stderr chunk (state=3): >>><<< 10939 1726773089.70545: stdout chunk (state=3): >>><<< 10939 1726773089.70560: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "start": "2024-09-19 15:11:28.545371", "end": "2024-09-19 15:11:29.690177", "delta": "0:00:01.144806", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.47.99 closed. 
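The command just executed passes two profile names in one argument: tuned-adm accepts a space-separated list of profiles and applies their merger, so the role keeps the previously active "virtual-guest" profile and adds its own "kernel_settings" profile on top. The __kernel_settings_active_profile value is presumably assembled along the following lines; this is purely an illustration, not the role's actual logic, and current_profile is a hypothetical variable standing in for however the role reads the live profile (e.g. from /etc/tuned/active_profile).

- name: Build the combined profile string (illustrative sketch only)
  ansible.builtin.set_fact:
    __kernel_settings_active_profile: "{{ (current_profile.split() + ['kernel_settings']) | unique | join(' ') }}"
  vars:
    current_profile: "virtual-guest"  # hypothetical stand-in for the detected active profile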
10939 1726773089.70589: done with _execute_module (ansible.legacy.command, {'_raw_params': "tuned-adm profile 'virtual-guest kernel_settings'", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773088.3016956-10939-171798909346989/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10939 1726773089.70598: _low_level_execute_command(): starting 10939 1726773089.70604: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773088.3016956-10939-171798909346989/ > /dev/null 2>&1 && sleep 0' 10939 1726773089.73168: stderr chunk (state=2): >>><<< 10939 1726773089.73180: stdout chunk (state=2): >>><<< 10939 1726773089.73199: _low_level_execute_command() done: rc=0, stdout=, stderr= 10939 1726773089.73208: handler run complete 10939 1726773089.73232: Evaluated conditional (True): True 10939 1726773089.73243: attempt loop complete, returning result 10939 1726773089.73247: _execute() done 10939 1726773089.73250: dumping result to json 10939 1726773089.73255: done dumping result, returning 10939 1726773089.73263: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings [0affffe7-6841-7dd6-8fa6-0000000006ec] 10939 1726773089.73268: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006ec 10939 1726773089.73306: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006ec 10939 1726773089.73310: WORKER PROCESS EXITING changed: [managed_node3] => { "changed": true, "cmd": [ "tuned-adm", "profile", "virtual-guest kernel_settings" ], "delta": "0:00:01.144806", "end": "2024-09-19 15:11:29.690177", "rc": 0, "start": "2024-09-19 15:11:28.545371" } 9733 1726773089.73717: no more pending results, returning what we have 9733 1726773089.73719: results queue empty 9733 1726773089.73720: checking for any_errors_fatal 9733 1726773089.73727: done checking for any_errors_fatal 9733 1726773089.73727: checking for max_fail_percentage 9733 1726773089.73728: done checking for max_fail_percentage 9733 1726773089.73728: checking to see if all hosts have failed and the running result is not ok 9733 1726773089.73729: done checking to see if all hosts have failed 9733 1726773089.73729: getting the remaining hosts for this loop 9733 1726773089.73730: done getting the remaining hosts for this loop 9733 1726773089.73732: getting the next task for host managed_node3 9733 1726773089.73738: done getting next task for host managed_node3 9733 1726773089.73741: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 9733 1726773089.73743: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773089.73750: getting variables 9733 1726773089.73751: in VariableManager get_vars() 9733 1726773089.73777: Calling all_inventory to load vars for managed_node3 9733 1726773089.73779: Calling groups_inventory to load vars for managed_node3 9733 1726773089.73780: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773089.73790: Calling all_plugins_play to load vars for managed_node3 9733 1726773089.73793: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773089.73797: Calling groups_plugins_play to load vars for managed_node3 9733 1726773089.74003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773089.74129: done with get_vars() 9733 1726773089.74139: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Verify settings] ************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:166 Thursday 19 September 2024 15:11:29 -0400 (0:00:01.504) 0:00:35.474 **** 9733 1726773089.74209: entering _queue_task() for managed_node3/include_tasks 9733 1726773089.74378: worker is 1 (out of 1 available) 9733 1726773089.74395: exiting _queue_task() for managed_node3/include_tasks 9733 1726773089.74407: done queuing things up, now waiting for results queue to drain 9733 1726773089.74409: waiting for pending results... 
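The "Verify settings" task (tasks/main.yml:166) is an include_tasks guarded by the same register that triggered the apply step, so verification only runs when something was actually re-applied; the included file verify_settings.yml is loaded in the entries that follow. A hedged sketch of what the include plausibly looks like:

- name: Verify settings
  ansible.builtin.include_tasks: verify_settings.yml
  when: __kernel_settings_register_apply is changed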
10984 1726773089.74538: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Verify settings 10984 1726773089.74661: in run() - task 0affffe7-6841-7dd6-8fa6-0000000006ed 10984 1726773089.74677: variable 'ansible_search_path' from source: unknown 10984 1726773089.74681: variable 'ansible_search_path' from source: unknown 10984 1726773089.74714: calling self._execute() 10984 1726773089.74789: variable 'ansible_host' from source: host vars for 'managed_node3' 10984 1726773089.74798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10984 1726773089.74809: variable 'omit' from source: magic vars 10984 1726773089.75143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10984 1726773089.75335: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10984 1726773089.75369: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10984 1726773089.75398: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10984 1726773089.75424: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10984 1726773089.75506: variable '__kernel_settings_register_apply' from source: set_fact 10984 1726773089.75529: Evaluated conditional (__kernel_settings_register_apply is changed): True 10984 1726773089.75535: _execute() done 10984 1726773089.75539: dumping result to json 10984 1726773089.75543: done dumping result, returning 10984 1726773089.75549: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Verify settings [0affffe7-6841-7dd6-8fa6-0000000006ed] 10984 1726773089.75554: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006ed 10984 1726773089.75578: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006ed 10984 1726773089.75581: WORKER PROCESS EXITING 9733 1726773089.75692: no more pending results, returning what we have 9733 1726773089.75697: in VariableManager get_vars() 9733 1726773089.75736: Calling all_inventory to load vars for managed_node3 9733 1726773089.75738: Calling groups_inventory to load vars for managed_node3 9733 1726773089.75740: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773089.75750: Calling all_plugins_play to load vars for managed_node3 9733 1726773089.75752: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773089.75754: Calling groups_plugins_play to load vars for managed_node3 9733 1726773089.75878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773089.76004: done with get_vars() 9733 1726773089.76011: variable 'ansible_search_path' from source: unknown 9733 1726773089.76012: variable 'ansible_search_path' from source: unknown 9733 1726773089.76034: we have included files to process 9733 1726773089.76035: generating all_blocks data 9733 1726773089.76039: done generating all_blocks data 9733 1726773089.76044: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 9733 1726773089.76045: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 9733 1726773089.76046: Loading data from 
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml for managed_node3 9733 1726773089.76314: done processing included file 9733 1726773089.76317: iterating over new_blocks loaded from include file 9733 1726773089.76318: in VariableManager get_vars() 9733 1726773089.76337: done with get_vars() 9733 1726773089.76338: filtering new block on tags 9733 1726773089.76381: done filtering new block on tags 9733 1726773089.76383: done iterating over new_blocks loaded from include file 9733 1726773089.76384: extending task lists for all hosts with included blocks 9733 1726773089.77261: done extending task lists 9733 1726773089.77262: done processing included files 9733 1726773089.77263: results queue empty 9733 1726773089.77263: checking for any_errors_fatal 9733 1726773089.77268: done checking for any_errors_fatal 9733 1726773089.77269: checking for max_fail_percentage 9733 1726773089.77272: done checking for max_fail_percentage 9733 1726773089.77273: checking to see if all hosts have failed and the running result is not ok 9733 1726773089.77273: done checking to see if all hosts have failed 9733 1726773089.77274: getting the remaining hosts for this loop 9733 1726773089.77275: done getting the remaining hosts for this loop 9733 1726773089.77277: getting the next task for host managed_node3 9733 1726773089.77282: done getting next task for host managed_node3 9733 1726773089.77284: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 9733 1726773089.77288: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 9733 1726773089.77299: getting variables 9733 1726773089.77302: in VariableManager get_vars() 9733 1726773089.77315: Calling all_inventory to load vars for managed_node3 9733 1726773089.77317: Calling groups_inventory to load vars for managed_node3 9733 1726773089.77319: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773089.77323: Calling all_plugins_play to load vars for managed_node3 9733 1726773089.77325: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773089.77327: Calling groups_plugins_play to load vars for managed_node3 9733 1726773089.77466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773089.77630: done with get_vars() 9733 1726773089.77638: done getting variables 9733 1726773089.77671: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2 Thursday 19 September 2024 15:11:29 -0400 (0:00:00.034) 0:00:35.509 **** 9733 1726773089.77705: entering _queue_task() for managed_node3/command 9733 1726773089.77923: worker is 1 (out of 1 available) 9733 1726773089.77935: exiting _queue_task() for managed_node3/command 9733 1726773089.77948: done queuing things up, now waiting for results queue to drain 9733 1726773089.77949: waiting for pending results... 10985 1726773089.78182: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 10985 1726773089.78358: in run() - task 0affffe7-6841-7dd6-8fa6-0000000007cc 10985 1726773089.78379: variable 'ansible_search_path' from source: unknown 10985 1726773089.78384: variable 'ansible_search_path' from source: unknown 10985 1726773089.78419: calling self._execute() 10985 1726773089.78509: variable 'ansible_host' from source: host vars for 'managed_node3' 10985 1726773089.78518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10985 1726773089.78527: variable 'omit' from source: magic vars 10985 1726773089.78630: variable 'omit' from source: magic vars 10985 1726773089.78688: variable 'omit' from source: magic vars 10985 1726773089.78716: variable 'omit' from source: magic vars 10985 1726773089.78749: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10985 1726773089.78774: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10985 1726773089.78794: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10985 1726773089.78817: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10985 1726773089.78834: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10985 1726773089.78860: variable 'inventory_hostname' from source: host vars for 'managed_node3' 10985 1726773089.78866: variable 'ansible_host' from source: host vars for 'managed_node3' 10985 1726773089.78870: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10985 1726773089.78941: Set connection var ansible_timeout to 10 10985 1726773089.78946: Set connection var ansible_shell_type to sh 10985 1726773089.78952: Set connection var ansible_module_compression to ZIP_DEFLATED 10985 1726773089.78958: Set connection var ansible_shell_executable to /bin/sh 10985 1726773089.78962: Set connection var ansible_pipelining to False 10985 1726773089.78966: Set connection var ansible_connection to ssh 10985 1726773089.78980: variable 'ansible_shell_executable' from source: unknown 10985 1726773089.78983: variable 'ansible_connection' from source: unknown 10985 1726773089.78998: variable 'ansible_module_compression' from source: unknown 10985 1726773089.79004: variable 'ansible_shell_type' from source: unknown 10985 1726773089.79008: variable 'ansible_shell_executable' from source: unknown 10985 1726773089.79012: variable 'ansible_host' from source: host vars for 'managed_node3' 10985 1726773089.79016: variable 'ansible_pipelining' from source: unknown 10985 1726773089.79019: variable 'ansible_timeout' from source: unknown 10985 1726773089.79024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10985 1726773089.79121: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10985 1726773089.79133: variable 'omit' from source: magic vars 10985 1726773089.79140: starting attempt loop 10985 1726773089.79143: running the handler 10985 1726773089.79156: _low_level_execute_command(): starting 10985 1726773089.79164: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10985 1726773089.81556: stdout chunk (state=2): >>>/root <<< 10985 1726773089.81673: stderr chunk (state=3): >>><<< 10985 1726773089.81679: stdout chunk (state=3): >>><<< 10985 1726773089.81700: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10985 1726773089.81715: _low_level_execute_command(): starting 10985 1726773089.81722: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773089.8170996-10985-11444041895980 `" && echo ansible-tmp-1726773089.8170996-10985-11444041895980="` echo /root/.ansible/tmp/ansible-tmp-1726773089.8170996-10985-11444041895980 `" ) && sleep 0' 10985 1726773089.84228: stdout chunk (state=2): >>>ansible-tmp-1726773089.8170996-10985-11444041895980=/root/.ansible/tmp/ansible-tmp-1726773089.8170996-10985-11444041895980 <<< 10985 1726773089.84364: stderr chunk (state=3): >>><<< 10985 1726773089.84370: stdout chunk (state=3): >>><<< 10985 1726773089.84387: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773089.8170996-10985-11444041895980=/root/.ansible/tmp/ansible-tmp-1726773089.8170996-10985-11444041895980 , stderr= 10985 1726773089.84415: variable 'ansible_module_compression' from source: unknown 10985 1726773089.84461: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10985 1726773089.84493: variable 'ansible_facts' from source: unknown 10985 1726773089.84569: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773089.8170996-10985-11444041895980/AnsiballZ_command.py 10985 
1726773089.84670: Sending initial data 10985 1726773089.84677: Sent initial data (154 bytes) 10985 1726773089.87225: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmp3v2xvlqi /root/.ansible/tmp/ansible-tmp-1726773089.8170996-10985-11444041895980/AnsiballZ_command.py <<< 10985 1726773089.88407: stderr chunk (state=3): >>><<< 10985 1726773089.88416: stdout chunk (state=3): >>><<< 10985 1726773089.88438: done transferring module to remote 10985 1726773089.88450: _low_level_execute_command(): starting 10985 1726773089.88455: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773089.8170996-10985-11444041895980/ /root/.ansible/tmp/ansible-tmp-1726773089.8170996-10985-11444041895980/AnsiballZ_command.py && sleep 0' 10985 1726773089.90854: stderr chunk (state=2): >>><<< 10985 1726773089.90864: stdout chunk (state=2): >>><<< 10985 1726773089.90879: _low_level_execute_command() done: rc=0, stdout=, stderr= 10985 1726773089.90883: _low_level_execute_command(): starting 10985 1726773089.90890: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773089.8170996-10985-11444041895980/AnsiballZ_command.py && sleep 0' 10985 1726773090.16739: stdout chunk (state=2): >>> {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 15:11:30.058296", "end": "2024-09-19 15:11:30.165227", "delta": "0:00:00.106931", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10985 1726773090.17951: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 10985 1726773090.18004: stderr chunk (state=3): >>><<< 10985 1726773090.18011: stdout chunk (state=3): >>><<< 10985 1726773090.18028: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 15:11:30.058296", "end": "2024-09-19 15:11:30.165227", "delta": "0:00:00.106931", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.47.99 closed. 
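The stdout above comes from "tuned-adm verify -i" (per the tuned-adm man page, -i/--ignore-missing skips tunables that do not exist on the system). The final task result reports "changed": false even though the command module itself returned changed: true, which suggests the task forces changed_when: false; the register name is taken from the conditional of the follow-up log-scraping task. An approximate reconstruction, not the role's verbatim source:

- name: Check that settings are applied correctly
  ansible.builtin.command: tuned-adm verify -i
  register: __kernel_settings_register_verify_values
  changed_when: false  # inferred from the "changed": false task result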
10985 1726773090.18070: done with _execute_module (ansible.legacy.command, {'_raw_params': 'tuned-adm verify -i', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773089.8170996-10985-11444041895980/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10985 1726773090.18082: _low_level_execute_command(): starting 10985 1726773090.18089: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773089.8170996-10985-11444041895980/ > /dev/null 2>&1 && sleep 0' 10985 1726773090.20612: stderr chunk (state=2): >>><<< 10985 1726773090.20623: stdout chunk (state=2): >>><<< 10985 1726773090.20638: _low_level_execute_command() done: rc=0, stdout=, stderr= 10985 1726773090.20645: handler run complete 10985 1726773090.20663: Evaluated conditional (False): False 10985 1726773090.20673: attempt loop complete, returning result 10985 1726773090.20677: _execute() done 10985 1726773090.20680: dumping result to json 10985 1726773090.20687: done dumping result, returning 10985 1726773090.20695: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly [0affffe7-6841-7dd6-8fa6-0000000007cc] 10985 1726773090.20701: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000007cc 10985 1726773090.20735: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000007cc 10985 1726773090.20739: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "tuned-adm", "verify", "-i" ], "delta": "0:00:00.106931", "end": "2024-09-19 15:11:30.165227", "rc": 0, "start": "2024-09-19 15:11:30.058296" } STDOUT: Verification succeeded, current system settings match the preset profile. See TuneD log file ('/var/log/tuned/tuned.log') for details. 9733 1726773090.20899: no more pending results, returning what we have 9733 1726773090.20905: results queue empty 9733 1726773090.20906: checking for any_errors_fatal 9733 1726773090.20907: done checking for any_errors_fatal 9733 1726773090.20908: checking for max_fail_percentage 9733 1726773090.20909: done checking for max_fail_percentage 9733 1726773090.20910: checking to see if all hosts have failed and the running result is not ok 9733 1726773090.20911: done checking to see if all hosts have failed 9733 1726773090.20911: getting the remaining hosts for this loop 9733 1726773090.20912: done getting the remaining hosts for this loop 9733 1726773090.20915: getting the next task for host managed_node3 9733 1726773090.20921: done getting next task for host managed_node3 9733 1726773090.20924: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 9733 1726773090.20928: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773090.20938: getting variables 9733 1726773090.20940: in VariableManager get_vars() 9733 1726773090.20974: Calling all_inventory to load vars for managed_node3 9733 1726773090.20977: Calling groups_inventory to load vars for managed_node3 9733 1726773090.20978: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773090.20990: Calling all_plugins_play to load vars for managed_node3 9733 1726773090.20992: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773090.20995: Calling groups_plugins_play to load vars for managed_node3 9733 1726773090.21145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773090.21274: done with get_vars() 9733 1726773090.21282: done getting variables 9733 1726773090.21332: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Get last verify results from log] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:12 Thursday 19 September 2024 15:11:30 -0400 (0:00:00.436) 0:00:35.946 **** 9733 1726773090.21357: entering _queue_task() for managed_node3/shell 9733 1726773090.21536: worker is 1 (out of 1 available) 9733 1726773090.21552: exiting _queue_task() for managed_node3/shell 9733 1726773090.21564: done queuing things up, now waiting for results queue to drain 9733 1726773090.21566: waiting for pending results... 
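The task just queued (verify_settings.yml:12) goes through the shell action plugin and, as the worker output below shows, only runs when the registered verify result is failed. The actual command is not visible in this log, so the sketch below is hypothetical apart from the task name, the module, and the when condition; the grep target is guessed from the tuned.log path mentioned in the verify output:

    - name: Get last verify results from log
      shell: grep -i verif /var/log/tuned/tuned.log | tail -n 20    # hypothetical command
      register: __kernel_settings_verify_log_results                # hypothetical register name
      changed_when: false
      when: __kernel_settings_register_verify_values is failed

Since the preceding verify succeeded, the conditional evaluates to False and the task is skipped.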
10998 1726773090.21699: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 10998 1726773090.21844: in run() - task 0affffe7-6841-7dd6-8fa6-0000000007cd 10998 1726773090.21861: variable 'ansible_search_path' from source: unknown 10998 1726773090.21864: variable 'ansible_search_path' from source: unknown 10998 1726773090.21894: calling self._execute() 10998 1726773090.21965: variable 'ansible_host' from source: host vars for 'managed_node3' 10998 1726773090.21974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10998 1726773090.21983: variable 'omit' from source: magic vars 10998 1726773090.22310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10998 1726773090.22490: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10998 1726773090.22525: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10998 1726773090.22553: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10998 1726773090.22580: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10998 1726773090.22660: variable '__kernel_settings_register_verify_values' from source: set_fact 10998 1726773090.22690: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 10998 1726773090.22696: when evaluation is False, skipping this task 10998 1726773090.22699: _execute() done 10998 1726773090.22702: dumping result to json 10998 1726773090.22706: done dumping result, returning 10998 1726773090.22712: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log [0affffe7-6841-7dd6-8fa6-0000000007cd] 10998 1726773090.22718: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000007cd 10998 1726773090.22739: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000007cd 10998 1726773090.22743: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 9733 1726773090.22854: no more pending results, returning what we have 9733 1726773090.22857: results queue empty 9733 1726773090.22858: checking for any_errors_fatal 9733 1726773090.22867: done checking for any_errors_fatal 9733 1726773090.22867: checking for max_fail_percentage 9733 1726773090.22869: done checking for max_fail_percentage 9733 1726773090.22869: checking to see if all hosts have failed and the running result is not ok 9733 1726773090.22870: done checking to see if all hosts have failed 9733 1726773090.22870: getting the remaining hosts for this loop 9733 1726773090.22871: done getting the remaining hosts for this loop 9733 1726773090.22874: getting the next task for host managed_node3 9733 1726773090.22881: done getting next task for host managed_node3 9733 1726773090.22884: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 9733 1726773090.22891: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773090.22908: getting variables 9733 1726773090.22909: in VariableManager get_vars() 9733 1726773090.22942: Calling all_inventory to load vars for managed_node3 9733 1726773090.22945: Calling groups_inventory to load vars for managed_node3 9733 1726773090.22947: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773090.22956: Calling all_plugins_play to load vars for managed_node3 9733 1726773090.22958: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773090.22960: Calling groups_plugins_play to load vars for managed_node3 9733 1726773090.23069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773090.23196: done with get_vars() 9733 1726773090.23207: done getting variables 9733 1726773090.23248: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:23 Thursday 19 September 2024 15:11:30 -0400 (0:00:00.019) 0:00:35.965 **** 9733 1726773090.23272: entering _queue_task() for managed_node3/fail 9733 1726773090.23443: worker is 1 (out of 1 available) 9733 1726773090.23458: exiting _queue_task() for managed_node3/fail 9733 1726773090.23470: done queuing things up, now waiting for results queue to drain 9733 1726773090.23472: waiting for pending results... 
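The 'Report errors that are not bootloader errors' task queued here is handled by the fail action plugin and is gated on the same is-failed test, so it is also skipped in this run. Only the module and the when condition are visible in the log; the message body below is a hypothetical placeholder:

    - name: Report errors that are not bootloader errors
      fail:
        msg: >-                                   # hypothetical message text
          tuned-adm verify reported errors that are not bootloader related:
          {{ __kernel_settings_register_verify_values.stderr | default('') }}
      when: __kernel_settings_register_verify_values is failed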
10999 1726773090.23599: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 10999 1726773090.23727: in run() - task 0affffe7-6841-7dd6-8fa6-0000000007ce 10999 1726773090.23743: variable 'ansible_search_path' from source: unknown 10999 1726773090.23746: variable 'ansible_search_path' from source: unknown 10999 1726773090.23773: calling self._execute() 10999 1726773090.23845: variable 'ansible_host' from source: host vars for 'managed_node3' 10999 1726773090.23854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 10999 1726773090.23862: variable 'omit' from source: magic vars 10999 1726773090.24191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10999 1726773090.24432: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10999 1726773090.24466: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10999 1726773090.24506: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10999 1726773090.24533: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10999 1726773090.24614: variable '__kernel_settings_register_verify_values' from source: set_fact 10999 1726773090.24637: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 10999 1726773090.24642: when evaluation is False, skipping this task 10999 1726773090.24645: _execute() done 10999 1726773090.24649: dumping result to json 10999 1726773090.24653: done dumping result, returning 10999 1726773090.24659: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors [0affffe7-6841-7dd6-8fa6-0000000007ce] 10999 1726773090.24665: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000007ce 10999 1726773090.24691: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000007ce 10999 1726773090.24695: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 9733 1726773090.24820: no more pending results, returning what we have 9733 1726773090.24823: results queue empty 9733 1726773090.24823: checking for any_errors_fatal 9733 1726773090.24828: done checking for any_errors_fatal 9733 1726773090.24828: checking for max_fail_percentage 9733 1726773090.24830: done checking for max_fail_percentage 9733 1726773090.24830: checking to see if all hosts have failed and the running result is not ok 9733 1726773090.24831: done checking to see if all hosts have failed 9733 1726773090.24831: getting the remaining hosts for this loop 9733 1726773090.24832: done getting the remaining hosts for this loop 9733 1726773090.24835: getting the next task for host managed_node3 9733 1726773090.24844: done getting next task for host managed_node3 9733 1726773090.24847: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 9733 1726773090.24851: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773090.24865: getting variables 9733 1726773090.24867: in VariableManager get_vars() 9733 1726773090.24899: Calling all_inventory to load vars for managed_node3 9733 1726773090.24904: Calling groups_inventory to load vars for managed_node3 9733 1726773090.24906: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773090.24914: Calling all_plugins_play to load vars for managed_node3 9733 1726773090.24916: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773090.24918: Calling groups_plugins_play to load vars for managed_node3 9733 1726773090.25067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773090.25192: done with get_vars() 9733 1726773090.25199: done getting variables 9733 1726773090.25242: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:177 Thursday 19 September 2024 15:11:30 -0400 (0:00:00.019) 0:00:35.985 **** 9733 1726773090.25264: entering _queue_task() for managed_node3/set_fact 9733 1726773090.25428: worker is 1 (out of 1 available) 9733 1726773090.25440: exiting _queue_task() for managed_node3/set_fact 9733 1726773090.25452: done queuing things up, now waiting for results queue to drain 9733 1726773090.25453: waiting for pending results... 
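The set_fact task queued here produces the kernel_settings_reboot_required fact recorded in the result a little further down. A minimal sketch, assuming the role simply clears the flag at this point because the verification passed; the real role may compute the value from earlier results rather than hard-coding it:

    - name: Set the flag that reboot is needed to apply changes
      set_fact:
        kernel_settings_reboot_required: false    # matches the value in the recorded result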
11000 1726773090.25580: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 11000 1726773090.25696: in run() - task 0affffe7-6841-7dd6-8fa6-0000000006ee 11000 1726773090.25710: variable 'ansible_search_path' from source: unknown 11000 1726773090.25713: variable 'ansible_search_path' from source: unknown 11000 1726773090.25738: calling self._execute() 11000 1726773090.25813: variable 'ansible_host' from source: host vars for 'managed_node3' 11000 1726773090.25820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11000 1726773090.25826: variable 'omit' from source: magic vars 11000 1726773090.25899: variable 'omit' from source: magic vars 11000 1726773090.25938: variable 'omit' from source: magic vars 11000 1726773090.25961: variable 'omit' from source: magic vars 11000 1726773090.25995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726773090.26022: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726773090.26041: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726773090.26056: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726773090.26066: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726773090.26091: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11000 1726773090.26096: variable 'ansible_host' from source: host vars for 'managed_node3' 11000 1726773090.26100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11000 1726773090.26171: Set connection var ansible_timeout to 10 11000 1726773090.26176: Set connection var ansible_shell_type to sh 11000 1726773090.26182: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726773090.26189: Set connection var ansible_shell_executable to /bin/sh 11000 1726773090.26194: Set connection var ansible_pipelining to False 11000 1726773090.26201: Set connection var ansible_connection to ssh 11000 1726773090.26218: variable 'ansible_shell_executable' from source: unknown 11000 1726773090.26222: variable 'ansible_connection' from source: unknown 11000 1726773090.26226: variable 'ansible_module_compression' from source: unknown 11000 1726773090.26229: variable 'ansible_shell_type' from source: unknown 11000 1726773090.26232: variable 'ansible_shell_executable' from source: unknown 11000 1726773090.26236: variable 'ansible_host' from source: host vars for 'managed_node3' 11000 1726773090.26239: variable 'ansible_pipelining' from source: unknown 11000 1726773090.26243: variable 'ansible_timeout' from source: unknown 11000 1726773090.26246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11000 1726773090.26339: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726773090.26352: variable 'omit' from source: magic vars 11000 1726773090.26359: starting attempt loop 11000 1726773090.26363: running the handler 11000 1726773090.26373: handler 
run complete 11000 1726773090.26381: attempt loop complete, returning result 11000 1726773090.26384: _execute() done 11000 1726773090.26390: dumping result to json 11000 1726773090.26397: done dumping result, returning 11000 1726773090.26404: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes [0affffe7-6841-7dd6-8fa6-0000000006ee] 11000 1726773090.26410: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006ee 11000 1726773090.26429: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006ee 11000 1726773090.26433: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "kernel_settings_reboot_required": false }, "changed": false } 9733 1726773090.26551: no more pending results, returning what we have 9733 1726773090.26554: results queue empty 9733 1726773090.26554: checking for any_errors_fatal 9733 1726773090.26560: done checking for any_errors_fatal 9733 1726773090.26561: checking for max_fail_percentage 9733 1726773090.26562: done checking for max_fail_percentage 9733 1726773090.26563: checking to see if all hosts have failed and the running result is not ok 9733 1726773090.26563: done checking to see if all hosts have failed 9733 1726773090.26564: getting the remaining hosts for this loop 9733 1726773090.26565: done getting the remaining hosts for this loop 9733 1726773090.26568: getting the next task for host managed_node3 9733 1726773090.26573: done getting next task for host managed_node3 9733 1726773090.26576: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 9733 1726773090.26579: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 9733 1726773090.26590: getting variables 9733 1726773090.26591: in VariableManager get_vars() 9733 1726773090.26622: Calling all_inventory to load vars for managed_node3 9733 1726773090.26624: Calling groups_inventory to load vars for managed_node3 9733 1726773090.26625: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773090.26632: Calling all_plugins_play to load vars for managed_node3 9733 1726773090.26633: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773090.26635: Calling groups_plugins_play to load vars for managed_node3 9733 1726773090.26742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773090.26863: done with get_vars() 9733 1726773090.26871: done getting variables 9733 1726773090.26913: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:181 Thursday 19 September 2024 15:11:30 -0400 (0:00:00.016) 0:00:36.001 **** 9733 1726773090.26935: entering _queue_task() for managed_node3/set_fact 9733 1726773090.27090: worker is 1 (out of 1 available) 9733 1726773090.27107: exiting _queue_task() for managed_node3/set_fact 9733 1726773090.27119: done queuing things up, now waiting for results queue to drain 9733 1726773090.27121: waiting for pending results... 11001 1726773090.27244: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 11001 1726773090.27360: in run() - task 0affffe7-6841-7dd6-8fa6-0000000006ef 11001 1726773090.27376: variable 'ansible_search_path' from source: unknown 11001 1726773090.27380: variable 'ansible_search_path' from source: unknown 11001 1726773090.27410: calling self._execute() 11001 1726773090.27477: variable 'ansible_host' from source: host vars for 'managed_node3' 11001 1726773090.27487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11001 1726773090.27497: variable 'omit' from source: magic vars 11001 1726773090.27567: variable 'omit' from source: magic vars 11001 1726773090.27612: variable 'omit' from source: magic vars 11001 1726773090.27870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11001 1726773090.28110: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11001 1726773090.28143: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11001 1726773090.28168: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11001 1726773090.28196: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11001 1726773090.28293: variable '__kernel_settings_register_profile' from source: set_fact 11001 1726773090.28308: variable '__kernel_settings_register_mode' from source: set_fact 11001 1726773090.28316: variable '__kernel_settings_register_apply' from source: set_fact 11001 1726773090.28354: variable 'omit' from source: magic vars 11001 
1726773090.28375: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11001 1726773090.28397: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11001 1726773090.28415: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11001 1726773090.28430: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11001 1726773090.28439: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11001 1726773090.28462: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11001 1726773090.28467: variable 'ansible_host' from source: host vars for 'managed_node3' 11001 1726773090.28471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11001 1726773090.28541: Set connection var ansible_timeout to 10 11001 1726773090.28546: Set connection var ansible_shell_type to sh 11001 1726773090.28552: Set connection var ansible_module_compression to ZIP_DEFLATED 11001 1726773090.28557: Set connection var ansible_shell_executable to /bin/sh 11001 1726773090.28562: Set connection var ansible_pipelining to False 11001 1726773090.28569: Set connection var ansible_connection to ssh 11001 1726773090.28584: variable 'ansible_shell_executable' from source: unknown 11001 1726773090.28589: variable 'ansible_connection' from source: unknown 11001 1726773090.28593: variable 'ansible_module_compression' from source: unknown 11001 1726773090.28596: variable 'ansible_shell_type' from source: unknown 11001 1726773090.28600: variable 'ansible_shell_executable' from source: unknown 11001 1726773090.28605: variable 'ansible_host' from source: host vars for 'managed_node3' 11001 1726773090.28610: variable 'ansible_pipelining' from source: unknown 11001 1726773090.28613: variable 'ansible_timeout' from source: unknown 11001 1726773090.28617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11001 1726773090.28687: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11001 1726773090.28699: variable 'omit' from source: magic vars 11001 1726773090.28708: starting attempt loop 11001 1726773090.28712: running the handler 11001 1726773090.28721: handler run complete 11001 1726773090.28728: attempt loop complete, returning result 11001 1726773090.28731: _execute() done 11001 1726773090.28734: dumping result to json 11001 1726773090.28737: done dumping result, returning 11001 1726773090.28743: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing [0affffe7-6841-7dd6-8fa6-0000000006ef] 11001 1726773090.28749: sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006ef 11001 1726773090.28766: done sending task result for task 0affffe7-6841-7dd6-8fa6-0000000006ef 11001 1726773090.28768: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_changed": true }, "changed": false } 9733 1726773090.28914: no more pending results, returning what we have 9733 1726773090.28916: results queue empty 9733 1726773090.28917: checking 
for any_errors_fatal 9733 1726773090.28921: done checking for any_errors_fatal 9733 1726773090.28922: checking for max_fail_percentage 9733 1726773090.28924: done checking for max_fail_percentage 9733 1726773090.28924: checking to see if all hosts have failed and the running result is not ok 9733 1726773090.28925: done checking to see if all hosts have failed 9733 1726773090.28925: getting the remaining hosts for this loop 9733 1726773090.28926: done getting the remaining hosts for this loop 9733 1726773090.28929: getting the next task for host managed_node3 9733 1726773090.28937: done getting next task for host managed_node3 9733 1726773090.28939: ^ task is: TASK: meta (role_complete) 9733 1726773090.28943: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773090.28952: getting variables 9733 1726773090.28953: in VariableManager get_vars() 9733 1726773090.28982: Calling all_inventory to load vars for managed_node3 9733 1726773090.28984: Calling groups_inventory to load vars for managed_node3 9733 1726773090.28988: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773090.28995: Calling all_plugins_play to load vars for managed_node3 9733 1726773090.28997: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773090.28999: Calling groups_plugins_play to load vars for managed_node3 9733 1726773090.29104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773090.29261: done with get_vars() 9733 1726773090.29268: done getting variables 9733 1726773090.29322: done queuing things up, now waiting for results queue to drain 9733 1726773090.29327: results queue empty 9733 1726773090.29328: checking for any_errors_fatal 9733 1726773090.29330: done checking for any_errors_fatal 9733 1726773090.29331: checking for max_fail_percentage 9733 1726773090.29331: done checking for max_fail_percentage 9733 1726773090.29331: checking to see if all hosts have failed and the running result is not ok 9733 1726773090.29332: done checking to see if all hosts have failed 9733 1726773090.29332: getting the remaining hosts for this loop 9733 1726773090.29332: done getting the remaining hosts for this loop 9733 1726773090.29334: getting the next task for host managed_node3 9733 1726773090.29336: done getting next task for host managed_node3 9733 1726773090.29337: ^ task is: TASK: Verify no settings 9733 1726773090.29338: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773090.29340: getting variables 9733 1726773090.29340: in VariableManager get_vars() 9733 1726773090.29347: Calling all_inventory to load vars for managed_node3 9733 1726773090.29348: Calling groups_inventory to load vars for managed_node3 9733 1726773090.29349: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773090.29352: Calling all_plugins_play to load vars for managed_node3 9733 1726773090.29353: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773090.29355: Calling groups_plugins_play to load vars for managed_node3 9733 1726773090.29431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773090.29532: done with get_vars() 9733 1726773090.29538: done getting variables 9733 1726773090.29562: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify no settings] ****************************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:20 Thursday 19 September 2024 15:11:30 -0400 (0:00:00.026) 0:00:36.028 **** 9733 1726773090.29582: entering _queue_task() for managed_node3/shell 9733 1726773090.29736: worker is 1 (out of 1 available) 9733 1726773090.29750: exiting _queue_task() for managed_node3/shell 9733 1726773090.29761: done queuing things up, now waiting for results queue to drain 9733 1726773090.29763: waiting for pending results... 
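A few entries above, 'Set flag to indicate changed for testing' read the registered results __kernel_settings_register_profile, __kernel_settings_register_mode and __kernel_settings_register_apply and produced __kernel_settings_changed: true. The exact expression is not shown in the log; a plausible sketch, assuming the usual is-changed combination of those three registers:

    - name: Set flag to indicate changed for testing
      set_fact:
        __kernel_settings_changed: "{{ __kernel_settings_register_profile is changed
          or __kernel_settings_register_mode is changed
          or __kernel_settings_register_apply is changed }}"

The 'Verify no settings' task queued directly above then checks, on the managed host, that the generated profile no longer carries any settings sections.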
11002 1726773090.29886: running TaskExecutor() for managed_node3/TASK: Verify no settings 11002 1726773090.29987: in run() - task 0affffe7-6841-7dd6-8fa6-00000000058b 11002 1726773090.30004: variable 'ansible_search_path' from source: unknown 11002 1726773090.30009: variable 'ansible_search_path' from source: unknown 11002 1726773090.30036: calling self._execute() 11002 1726773090.30110: variable 'ansible_host' from source: host vars for 'managed_node3' 11002 1726773090.30120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11002 1726773090.30128: variable 'omit' from source: magic vars 11002 1726773090.30202: variable 'omit' from source: magic vars 11002 1726773090.30234: variable 'omit' from source: magic vars 11002 1726773090.30469: variable '__kernel_settings_profile_filename' from source: role '' exported vars 11002 1726773090.30526: variable '__kernel_settings_profile_dir' from source: role '' exported vars 11002 1726773090.30584: variable '__kernel_settings_profile_parent' from source: set_fact 11002 1726773090.30593: variable '__kernel_settings_tuned_profile' from source: role '' exported vars 11002 1726773090.30626: variable 'omit' from source: magic vars 11002 1726773090.30655: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11002 1726773090.30679: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11002 1726773090.30698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11002 1726773090.30715: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11002 1726773090.30725: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11002 1726773090.30748: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11002 1726773090.30753: variable 'ansible_host' from source: host vars for 'managed_node3' 11002 1726773090.30758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11002 1726773090.30829: Set connection var ansible_timeout to 10 11002 1726773090.30834: Set connection var ansible_shell_type to sh 11002 1726773090.30840: Set connection var ansible_module_compression to ZIP_DEFLATED 11002 1726773090.30845: Set connection var ansible_shell_executable to /bin/sh 11002 1726773090.30851: Set connection var ansible_pipelining to False 11002 1726773090.30858: Set connection var ansible_connection to ssh 11002 1726773090.30874: variable 'ansible_shell_executable' from source: unknown 11002 1726773090.30878: variable 'ansible_connection' from source: unknown 11002 1726773090.30882: variable 'ansible_module_compression' from source: unknown 11002 1726773090.30886: variable 'ansible_shell_type' from source: unknown 11002 1726773090.30889: variable 'ansible_shell_executable' from source: unknown 11002 1726773090.30893: variable 'ansible_host' from source: host vars for 'managed_node3' 11002 1726773090.30897: variable 'ansible_pipelining' from source: unknown 11002 1726773090.30902: variable 'ansible_timeout' from source: unknown 11002 1726773090.30906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11002 1726773090.30996: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11002 1726773090.31009: variable 'omit' from source: magic vars 11002 1726773090.31015: starting attempt loop 11002 1726773090.31018: running the handler 11002 1726773090.31027: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11002 1726773090.31040: _low_level_execute_command(): starting 11002 1726773090.31048: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11002 1726773090.33438: stdout chunk (state=2): >>>/root <<< 11002 1726773090.33559: stderr chunk (state=3): >>><<< 11002 1726773090.33566: stdout chunk (state=3): >>><<< 11002 1726773090.33588: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11002 1726773090.33602: _low_level_execute_command(): starting 11002 1726773090.33609: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773090.335962-11002-169404623318560 `" && echo ansible-tmp-1726773090.335962-11002-169404623318560="` echo /root/.ansible/tmp/ansible-tmp-1726773090.335962-11002-169404623318560 `" ) && sleep 0' 11002 1726773090.36136: stdout chunk (state=2): >>>ansible-tmp-1726773090.335962-11002-169404623318560=/root/.ansible/tmp/ansible-tmp-1726773090.335962-11002-169404623318560 <<< 11002 1726773090.36265: stderr chunk (state=3): >>><<< 11002 1726773090.36271: stdout chunk (state=3): >>><<< 11002 1726773090.36288: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773090.335962-11002-169404623318560=/root/.ansible/tmp/ansible-tmp-1726773090.335962-11002-169404623318560 , stderr= 11002 1726773090.36314: variable 'ansible_module_compression' from source: unknown 11002 1726773090.36359: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11002 1726773090.36389: variable 'ansible_facts' from source: unknown 11002 1726773090.36463: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773090.335962-11002-169404623318560/AnsiballZ_command.py 11002 1726773090.36622: Sending initial data 11002 1726773090.36630: Sent initial data (154 bytes) 11002 1726773090.39250: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmp93psdzbc /root/.ansible/tmp/ansible-tmp-1726773090.335962-11002-169404623318560/AnsiballZ_command.py <<< 11002 1726773090.40456: stderr chunk (state=3): >>><<< 11002 1726773090.40466: stdout chunk (state=3): >>><<< 11002 1726773090.40488: done transferring module to remote 11002 1726773090.40499: _low_level_execute_command(): starting 11002 1726773090.40507: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773090.335962-11002-169404623318560/ /root/.ansible/tmp/ansible-tmp-1726773090.335962-11002-169404623318560/AnsiballZ_command.py && sleep 0' 11002 1726773090.42941: stderr chunk (state=2): >>><<< 11002 1726773090.42950: stdout chunk (state=2): >>><<< 11002 1726773090.42965: _low_level_execute_command() done: rc=0, stdout=, stderr= 11002 1726773090.42969: _low_level_execute_command(): starting 11002 1726773090.42976: 
_low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773090.335962-11002-169404623318560/AnsiballZ_command.py && sleep 0' 11002 1726773090.59020: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ conf=/etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysctl\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysfs\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[systemd\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[vm\\]' /etc/tuned/kernel_settings/tuned.conf\n+ exit 0", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "start": "2024-09-19 15:11:30.580726", "end": "2024-09-19 15:11:30.588108", "delta": "0:00:00.007382", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11002 1726773090.60218: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 11002 1726773090.60267: stderr chunk (state=3): >>><<< 11002 1726773090.60276: stdout chunk (state=3): >>><<< 11002 1726773090.60294: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ conf=/etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysctl\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysfs\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[systemd\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[vm\\]' /etc/tuned/kernel_settings/tuned.conf\n+ exit 0", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "start": "2024-09-19 15:11:30.580726", "end": "2024-09-19 15:11:30.588108", "delta": "0:00:00.007382", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.47.99 closed. 
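The _raw_params in the result above contain the entire verification script, so the test task at cleanup.yml:20 can be reconstructed almost verbatim; roughly (indentation normalized, and changed_when is assumed from the final changed: false in the reported result):

    - name: Verify no settings
      shell: |
        set -euxo pipefail
        exec 1>&2
        rc=0
        conf=/etc/tuned/kernel_settings/tuned.conf
        # fail if any settings section is still present in the generated profile
        for section in sysctl sysfs systemd vm; do
          if grep ^\\["$section"\\] "$conf"; then
            echo ERROR: "$section" settings present
            rc=1
          fi
        done
        exit "$rc"
      changed_when: false

Every grep in the trace misses, so the script exits 0 and the task reports success with no settings left behind.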
11002 1726773090.60325: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\["$section"\\\\] "$conf"; then\n echo ERROR: "$section" settings present\n rc=1\n fi\ndone\nexit "$rc"\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773090.335962-11002-169404623318560/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11002 1726773090.60336: _low_level_execute_command(): starting 11002 1726773090.60341: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773090.335962-11002-169404623318560/ > /dev/null 2>&1 && sleep 0' 11002 1726773090.62826: stderr chunk (state=2): >>><<< 11002 1726773090.62838: stdout chunk (state=2): >>><<< 11002 1726773090.62853: _low_level_execute_command() done: rc=0, stdout=, stderr= 11002 1726773090.62860: handler run complete 11002 1726773090.62876: Evaluated conditional (False): False 11002 1726773090.62886: attempt loop complete, returning result 11002 1726773090.62890: _execute() done 11002 1726773090.62893: dumping result to json 11002 1726773090.62899: done dumping result, returning 11002 1726773090.62906: done running TaskExecutor() for managed_node3/TASK: Verify no settings [0affffe7-6841-7dd6-8fa6-00000000058b] 11002 1726773090.62911: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000058b 11002 1726773090.62940: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000058b 11002 1726773090.62943: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "delta": "0:00:00.007382", "end": "2024-09-19 15:11:30.588108", "rc": 0, "start": "2024-09-19 15:11:30.580726" } STDERR: + exec + rc=0 + conf=/etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[sysctl\]' /etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[sysfs\]' /etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[systemd\]' /etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[vm\]' /etc/tuned/kernel_settings/tuned.conf + exit 0 9733 1726773090.63240: no more pending results, returning what we have 9733 1726773090.63242: results queue empty 9733 1726773090.63243: checking for any_errors_fatal 9733 1726773090.63244: done checking for any_errors_fatal 9733 1726773090.63245: checking for max_fail_percentage 9733 1726773090.63246: done checking for max_fail_percentage 9733 1726773090.63246: checking to see if all hosts have failed and the running result is not ok 9733 1726773090.63247: done checking to see if all hosts have failed 9733 1726773090.63247: getting the remaining hosts for this 
loop 9733 1726773090.63248: done getting the remaining hosts for this loop 9733 1726773090.63250: getting the next task for host managed_node3 9733 1726773090.63255: done getting next task for host managed_node3 9733 1726773090.63257: ^ task is: TASK: Remove kernel_settings tuned profile 9733 1726773090.63258: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773090.63261: getting variables 9733 1726773090.63262: in VariableManager get_vars() 9733 1726773090.63295: Calling all_inventory to load vars for managed_node3 9733 1726773090.63298: Calling groups_inventory to load vars for managed_node3 9733 1726773090.63300: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773090.63310: Calling all_plugins_play to load vars for managed_node3 9733 1726773090.63316: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773090.63318: Calling groups_plugins_play to load vars for managed_node3 9733 1726773090.63428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773090.63589: done with get_vars() 9733 1726773090.63597: done getting variables TASK [Remove kernel_settings tuned profile] ************************************ task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:36 Thursday 19 September 2024 15:11:30 -0400 (0:00:00.340) 0:00:36.369 **** 9733 1726773090.63662: entering _queue_task() for managed_node3/file 9733 1726773090.63836: worker is 1 (out of 1 available) 9733 1726773090.63851: exiting _queue_task() for managed_node3/file 9733 1726773090.63863: done queuing things up, now waiting for results queue to drain 9733 1726773090.63865: waiting for pending results... 
11010 1726773090.63991: running TaskExecutor() for managed_node3/TASK: Remove kernel_settings tuned profile 11010 1726773090.64099: in run() - task 0affffe7-6841-7dd6-8fa6-00000000058c 11010 1726773090.64115: variable 'ansible_search_path' from source: unknown 11010 1726773090.64118: variable 'ansible_search_path' from source: unknown 11010 1726773090.64144: calling self._execute() 11010 1726773090.64221: variable 'ansible_host' from source: host vars for 'managed_node3' 11010 1726773090.64228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11010 1726773090.64233: variable 'omit' from source: magic vars 11010 1726773090.64309: variable 'omit' from source: magic vars 11010 1726773090.64340: variable 'omit' from source: magic vars 11010 1726773090.64358: variable '__kernel_settings_profile_dir' from source: role '' exported vars 11010 1726773090.64580: variable '__kernel_settings_profile_dir' from source: role '' exported vars 11010 1726773090.64652: variable '__kernel_settings_profile_parent' from source: set_fact 11010 1726773090.64660: variable '__kernel_settings_tuned_profile' from source: role '' exported vars 11010 1726773090.64694: variable 'omit' from source: magic vars 11010 1726773090.64726: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11010 1726773090.64753: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11010 1726773090.64773: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11010 1726773090.64789: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11010 1726773090.64800: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11010 1726773090.64824: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11010 1726773090.64828: variable 'ansible_host' from source: host vars for 'managed_node3' 11010 1726773090.64833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11010 1726773090.64905: Set connection var ansible_timeout to 10 11010 1726773090.64910: Set connection var ansible_shell_type to sh 11010 1726773090.64916: Set connection var ansible_module_compression to ZIP_DEFLATED 11010 1726773090.64921: Set connection var ansible_shell_executable to /bin/sh 11010 1726773090.64927: Set connection var ansible_pipelining to False 11010 1726773090.64934: Set connection var ansible_connection to ssh 11010 1726773090.64949: variable 'ansible_shell_executable' from source: unknown 11010 1726773090.64953: variable 'ansible_connection' from source: unknown 11010 1726773090.64957: variable 'ansible_module_compression' from source: unknown 11010 1726773090.64960: variable 'ansible_shell_type' from source: unknown 11010 1726773090.64963: variable 'ansible_shell_executable' from source: unknown 11010 1726773090.64966: variable 'ansible_host' from source: host vars for 'managed_node3' 11010 1726773090.64968: variable 'ansible_pipelining' from source: unknown 11010 1726773090.64970: variable 'ansible_timeout' from source: unknown 11010 1726773090.64973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11010 1726773090.65111: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11010 1726773090.65121: variable 'omit' from source: magic vars 11010 1726773090.65128: starting attempt loop 11010 1726773090.65131: running the handler 11010 1726773090.65142: _low_level_execute_command(): starting 11010 1726773090.65149: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11010 1726773090.67504: stdout chunk (state=2): >>>/root <<< 11010 1726773090.67631: stderr chunk (state=3): >>><<< 11010 1726773090.67638: stdout chunk (state=3): >>><<< 11010 1726773090.67659: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11010 1726773090.67672: _low_level_execute_command(): starting 11010 1726773090.67679: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773090.6766722-11010-170562922328940 `" && echo ansible-tmp-1726773090.6766722-11010-170562922328940="` echo /root/.ansible/tmp/ansible-tmp-1726773090.6766722-11010-170562922328940 `" ) && sleep 0' 11010 1726773090.70151: stdout chunk (state=2): >>>ansible-tmp-1726773090.6766722-11010-170562922328940=/root/.ansible/tmp/ansible-tmp-1726773090.6766722-11010-170562922328940 <<< 11010 1726773090.70280: stderr chunk (state=3): >>><<< 11010 1726773090.70288: stdout chunk (state=3): >>><<< 11010 1726773090.70303: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773090.6766722-11010-170562922328940=/root/.ansible/tmp/ansible-tmp-1726773090.6766722-11010-170562922328940 , stderr= 11010 1726773090.70337: variable 'ansible_module_compression' from source: unknown 11010 1726773090.70381: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 11010 1726773090.70411: variable 'ansible_facts' from source: unknown 11010 1726773090.70479: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773090.6766722-11010-170562922328940/AnsiballZ_file.py 11010 1726773090.70577: Sending initial data 11010 1726773090.70584: Sent initial data (152 bytes) 11010 1726773090.73173: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmp8wl2lcsw /root/.ansible/tmp/ansible-tmp-1726773090.6766722-11010-170562922328940/AnsiballZ_file.py <<< 11010 1726773090.74382: stderr chunk (state=3): >>><<< 11010 1726773090.74393: stdout chunk (state=3): >>><<< 11010 1726773090.74415: done transferring module to remote 11010 1726773090.74427: _low_level_execute_command(): starting 11010 1726773090.74433: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773090.6766722-11010-170562922328940/ /root/.ansible/tmp/ansible-tmp-1726773090.6766722-11010-170562922328940/AnsiballZ_file.py && sleep 0' 11010 1726773090.76908: stderr chunk (state=2): >>><<< 11010 1726773090.76918: stdout chunk (state=2): >>><<< 11010 1726773090.76932: _low_level_execute_command() done: rc=0, stdout=, stderr= 11010 1726773090.76936: _low_level_execute_command(): starting 11010 1726773090.76941: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773090.6766722-11010-170562922328940/AnsiballZ_file.py && sleep 0' 11010 1726773090.92975: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": 
"directory", "path_content": {"directories": [], "files": ["/etc/tuned/kernel_settings/tuned.conf"]}}, "after": {"path": "/etc/tuned/kernel_settings", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "absent", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11010 1726773090.94126: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 11010 1726773090.94172: stderr chunk (state=3): >>><<< 11010 1726773090.94181: stdout chunk (state=3): >>><<< 11010 1726773090.94204: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "directory", "path_content": {"directories": [], "files": ["/etc/tuned/kernel_settings/tuned.conf"]}}, "after": {"path": "/etc/tuned/kernel_settings", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "absent", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.47.99 closed. 
11010 1726773090.94238: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'absent', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773090.6766722-11010-170562922328940/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11010 1726773090.94250: _low_level_execute_command(): starting 11010 1726773090.94256: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773090.6766722-11010-170562922328940/ > /dev/null 2>&1 && sleep 0' 11010 1726773090.96756: stderr chunk (state=2): >>><<< 11010 1726773090.96768: stdout chunk (state=2): >>><<< 11010 1726773090.96787: _low_level_execute_command() done: rc=0, stdout=, stderr= 11010 1726773090.96794: handler run complete 11010 1726773090.96816: attempt loop complete, returning result 11010 1726773090.96820: _execute() done 11010 1726773090.96824: dumping result to json 11010 1726773090.96830: done dumping result, returning 11010 1726773090.96838: done running TaskExecutor() for managed_node3/TASK: Remove kernel_settings tuned profile [0affffe7-6841-7dd6-8fa6-00000000058c] 11010 1726773090.96845: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000058c 11010 1726773090.96877: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000058c 11010 1726773090.96880: WORKER PROCESS EXITING changed: [managed_node3] => { "changed": true, "path": "/etc/tuned/kernel_settings", "state": "absent" } 9733 1726773090.97137: no more pending results, returning what we have 9733 1726773090.97140: results queue empty 9733 1726773090.97140: checking for any_errors_fatal 9733 1726773090.97145: done checking for any_errors_fatal 9733 1726773090.97146: checking for max_fail_percentage 9733 1726773090.97147: done checking for max_fail_percentage 9733 1726773090.97147: checking to see if all hosts have failed and the running result is not ok 9733 1726773090.97148: done checking to see if all hosts have failed 9733 1726773090.97148: getting the remaining hosts for this loop 9733 1726773090.97149: done getting the remaining hosts for this loop 9733 1726773090.97151: getting the next task for host managed_node3 9733 1726773090.97156: done getting next task for host managed_node3 9733 1726773090.97157: ^ task is: TASK: Get active_profile 9733 1726773090.97159: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 9733 1726773090.97162: getting variables 9733 1726773090.97163: in VariableManager get_vars() 9733 1726773090.97194: Calling all_inventory to load vars for managed_node3 9733 1726773090.97196: Calling groups_inventory to load vars for managed_node3 9733 1726773090.97198: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773090.97208: Calling all_plugins_play to load vars for managed_node3 9733 1726773090.97210: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773090.97211: Calling groups_plugins_play to load vars for managed_node3 9733 1726773090.97325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773090.97444: done with get_vars() 9733 1726773090.97453: done getting variables TASK [Get active_profile] ****************************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:41 Thursday 19 September 2024 15:11:30 -0400 (0:00:00.338) 0:00:36.707 **** 9733 1726773090.97525: entering _queue_task() for managed_node3/slurp 9733 1726773090.97707: worker is 1 (out of 1 available) 9733 1726773090.97720: exiting _queue_task() for managed_node3/slurp 9733 1726773090.97732: done queuing things up, now waiting for results queue to drain 9733 1726773090.97733: waiting for pending results... 11018 1726773090.97861: running TaskExecutor() for managed_node3/TASK: Get active_profile 11018 1726773090.97967: in run() - task 0affffe7-6841-7dd6-8fa6-00000000058d 11018 1726773090.97986: variable 'ansible_search_path' from source: unknown 11018 1726773090.97991: variable 'ansible_search_path' from source: unknown 11018 1726773090.98021: calling self._execute() 11018 1726773090.98100: variable 'ansible_host' from source: host vars for 'managed_node3' 11018 1726773090.98108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11018 1726773090.98117: variable 'omit' from source: magic vars 11018 1726773090.98196: variable 'omit' from source: magic vars 11018 1726773090.98228: variable 'omit' from source: magic vars 11018 1726773090.98249: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars 11018 1726773090.98467: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars 11018 1726773090.98530: variable '__kernel_settings_tuned_dir' from source: role '' exported vars 11018 1726773090.98561: variable 'omit' from source: magic vars 11018 1726773090.98594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11018 1726773090.98623: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11018 1726773090.98641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11018 1726773090.98713: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11018 1726773090.98724: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11018 1726773090.98745: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11018 1726773090.98749: variable 'ansible_host' from source: host vars for 'managed_node3' 11018 1726773090.98751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11018 1726773090.98819: Set 
connection var ansible_timeout to 10 11018 1726773090.98824: Set connection var ansible_shell_type to sh 11018 1726773090.98829: Set connection var ansible_module_compression to ZIP_DEFLATED 11018 1726773090.98832: Set connection var ansible_shell_executable to /bin/sh 11018 1726773090.98836: Set connection var ansible_pipelining to False 11018 1726773090.98840: Set connection var ansible_connection to ssh 11018 1726773090.98853: variable 'ansible_shell_executable' from source: unknown 11018 1726773090.98856: variable 'ansible_connection' from source: unknown 11018 1726773090.98857: variable 'ansible_module_compression' from source: unknown 11018 1726773090.98859: variable 'ansible_shell_type' from source: unknown 11018 1726773090.98861: variable 'ansible_shell_executable' from source: unknown 11018 1726773090.98862: variable 'ansible_host' from source: host vars for 'managed_node3' 11018 1726773090.98864: variable 'ansible_pipelining' from source: unknown 11018 1726773090.98866: variable 'ansible_timeout' from source: unknown 11018 1726773090.98868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11018 1726773090.99011: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11018 1726773090.99023: variable 'omit' from source: magic vars 11018 1726773090.99029: starting attempt loop 11018 1726773090.99032: running the handler 11018 1726773090.99045: _low_level_execute_command(): starting 11018 1726773090.99053: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11018 1726773091.01446: stdout chunk (state=2): >>>/root <<< 11018 1726773091.01566: stderr chunk (state=3): >>><<< 11018 1726773091.01574: stdout chunk (state=3): >>><<< 11018 1726773091.01594: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11018 1726773091.01609: _low_level_execute_command(): starting 11018 1726773091.01616: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773091.016047-11018-131633530484160 `" && echo ansible-tmp-1726773091.016047-11018-131633530484160="` echo /root/.ansible/tmp/ansible-tmp-1726773091.016047-11018-131633530484160 `" ) && sleep 0' 11018 1726773091.04172: stdout chunk (state=2): >>>ansible-tmp-1726773091.016047-11018-131633530484160=/root/.ansible/tmp/ansible-tmp-1726773091.016047-11018-131633530484160 <<< 11018 1726773091.04306: stderr chunk (state=3): >>><<< 11018 1726773091.04313: stdout chunk (state=3): >>><<< 11018 1726773091.04328: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773091.016047-11018-131633530484160=/root/.ansible/tmp/ansible-tmp-1726773091.016047-11018-131633530484160 , stderr= 11018 1726773091.04364: variable 'ansible_module_compression' from source: unknown 11018 1726773091.04398: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.slurp-ZIP_DEFLATED 11018 1726773091.04428: variable 'ansible_facts' from source: unknown 11018 1726773091.04498: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773091.016047-11018-131633530484160/AnsiballZ_slurp.py 11018 1726773091.04597: Sending initial data 11018 1726773091.04604: Sent initial data (152 bytes) 11018 1726773091.07274: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-9733e80v_zqz/tmp242xmxy0 /root/.ansible/tmp/ansible-tmp-1726773091.016047-11018-131633530484160/AnsiballZ_slurp.py <<< 11018 1726773091.08433: stderr chunk (state=3): >>><<< 11018 1726773091.08445: stdout chunk (state=3): >>><<< 11018 1726773091.08464: done transferring module to remote 11018 1726773091.08475: _low_level_execute_command(): starting 11018 1726773091.08480: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773091.016047-11018-131633530484160/ /root/.ansible/tmp/ansible-tmp-1726773091.016047-11018-131633530484160/AnsiballZ_slurp.py && sleep 0' 11018 1726773091.11019: stderr chunk (state=2): >>><<< 11018 1726773091.11030: stdout chunk (state=2): >>><<< 11018 1726773091.11049: _low_level_execute_command() done: rc=0, stdout=, stderr= 11018 1726773091.11054: _low_level_execute_command(): starting 11018 1726773091.11059: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773091.016047-11018-131633530484160/AnsiballZ_slurp.py && sleep 0' 11018 1726773091.26631: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 11018 1726773091.27825: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 11018 1726773091.27836: stdout chunk (state=3): >>><<< 11018 1726773091.27847: stderr chunk (state=3): >>><<< 11018 1726773091.27859: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.47.99 closed. 
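The slurp module always returns file contents base64-encoded; the payload above decodes to "virtual-guest kernel_settings", which is exactly why the following task rewrites the file with the kernel_settings entry stripped out. A minimal sketch of the read-and-decode pattern, assuming a registered variable name that does not appear in the original test code:

    # Hedged sketch: read the active profile, then decode it with the b64decode filter.
    - name: Get active_profile
      slurp:
        path: /etc/tuned/active_profile    # __kernel_settings_tuned_active_profile in the role
      register: __cur_profile_raw          # hypothetical register name

    - name: Show the decoded profile list
      debug:
        msg: "{{ __cur_profile_raw.content | b64decode | trim }}"   # "virtual-guest kernel_settings"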
11018 1726773091.27891: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773091.016047-11018-131633530484160/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11018 1726773091.27904: _low_level_execute_command(): starting 11018 1726773091.27911: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773091.016047-11018-131633530484160/ > /dev/null 2>&1 && sleep 0' 11018 1726773091.30628: stderr chunk (state=2): >>><<< 11018 1726773091.30636: stdout chunk (state=2): >>><<< 11018 1726773091.30652: _low_level_execute_command() done: rc=0, stdout=, stderr= 11018 1726773091.30659: handler run complete 11018 1726773091.30672: attempt loop complete, returning result 11018 1726773091.30676: _execute() done 11018 1726773091.30680: dumping result to json 11018 1726773091.30684: done dumping result, returning 11018 1726773091.30693: done running TaskExecutor() for managed_node3/TASK: Get active_profile [0affffe7-6841-7dd6-8fa6-00000000058d] 11018 1726773091.30702: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000058d 11018 1726773091.30739: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000058d 11018 1726773091.30742: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "encoding": "base64", "source": "/etc/tuned/active_profile" } 9733 1726773091.31093: no more pending results, returning what we have 9733 1726773091.31097: results queue empty 9733 1726773091.31097: checking for any_errors_fatal 9733 1726773091.31105: done checking for any_errors_fatal 9733 1726773091.31106: checking for max_fail_percentage 9733 1726773091.31107: done checking for max_fail_percentage 9733 1726773091.31108: checking to see if all hosts have failed and the running result is not ok 9733 1726773091.31109: done checking to see if all hosts have failed 9733 1726773091.31109: getting the remaining hosts for this loop 9733 1726773091.31110: done getting the remaining hosts for this loop 9733 1726773091.31113: getting the next task for host managed_node3 9733 1726773091.31119: done getting next task for host managed_node3 9733 1726773091.31121: ^ task is: TASK: Ensure kernel_settings is not in active_profile 9733 1726773091.31124: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 9733 1726773091.31127: getting variables 9733 1726773091.31129: in VariableManager get_vars() 9733 1726773091.31166: Calling all_inventory to load vars for managed_node3 9733 1726773091.31169: Calling groups_inventory to load vars for managed_node3 9733 1726773091.31171: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773091.31187: Calling all_plugins_play to load vars for managed_node3 9733 1726773091.31191: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773091.31194: Calling groups_plugins_play to load vars for managed_node3 9733 1726773091.31362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773091.31628: done with get_vars() 9733 1726773091.31638: done getting variables 9733 1726773091.31703: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure kernel_settings is not in active_profile] ************************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:46 Thursday 19 September 2024 15:11:31 -0400 (0:00:00.342) 0:00:37.049 **** 9733 1726773091.31728: entering _queue_task() for managed_node3/copy 9733 1726773091.31901: worker is 1 (out of 1 available) 9733 1726773091.31916: exiting _queue_task() for managed_node3/copy 9733 1726773091.31929: done queuing things up, now waiting for results queue to drain 9733 1726773091.31931: waiting for pending results... 11036 1726773091.32064: running TaskExecutor() for managed_node3/TASK: Ensure kernel_settings is not in active_profile 11036 1726773091.32171: in run() - task 0affffe7-6841-7dd6-8fa6-00000000058e 11036 1726773091.32186: variable 'ansible_search_path' from source: unknown 11036 1726773091.32191: variable 'ansible_search_path' from source: unknown 11036 1726773091.32224: calling self._execute() 11036 1726773091.32309: variable 'ansible_host' from source: host vars for 'managed_node3' 11036 1726773091.32320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11036 1726773091.32332: variable 'omit' from source: magic vars 11036 1726773091.32420: variable 'omit' from source: magic vars 11036 1726773091.32451: variable 'omit' from source: magic vars 11036 1726773091.32469: variable '__active_profile' from source: task vars 11036 1726773091.32694: variable '__active_profile' from source: task vars 11036 1726773091.32836: variable '__cur_profile' from source: task vars 11036 1726773091.33015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11036 1726773091.34998: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11036 1726773091.35044: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11036 1726773091.35073: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11036 1726773091.35115: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11036 1726773091.35136: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11036 
1726773091.35193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11036 1726773091.35216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11036 1726773091.35232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11036 1726773091.35256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11036 1726773091.35264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11036 1726773091.35349: variable '__kernel_settings_tuned_current_profile' from source: set_fact 11036 1726773091.35392: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars 11036 1726773091.35444: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars 11036 1726773091.35495: variable '__kernel_settings_tuned_dir' from source: role '' exported vars 11036 1726773091.35517: variable 'omit' from source: magic vars 11036 1726773091.35538: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11036 1726773091.35558: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11036 1726773091.35574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11036 1726773091.35589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11036 1726773091.35599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11036 1726773091.35624: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11036 1726773091.35628: variable 'ansible_host' from source: host vars for 'managed_node3' 11036 1726773091.35633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11036 1726773091.35699: Set connection var ansible_timeout to 10 11036 1726773091.35706: Set connection var ansible_shell_type to sh 11036 1726773091.35712: Set connection var ansible_module_compression to ZIP_DEFLATED 11036 1726773091.35717: Set connection var ansible_shell_executable to /bin/sh 11036 1726773091.35723: Set connection var ansible_pipelining to False 11036 1726773091.35730: Set connection var ansible_connection to ssh 11036 1726773091.35747: variable 'ansible_shell_executable' from source: unknown 11036 1726773091.35751: variable 'ansible_connection' from source: unknown 11036 1726773091.35754: variable 'ansible_module_compression' from source: unknown 11036 1726773091.35758: variable 'ansible_shell_type' from source: unknown 11036 1726773091.35761: variable 'ansible_shell_executable' from source: unknown 11036 1726773091.35764: variable 'ansible_host' from source: host vars for 'managed_node3' 11036 1726773091.35768: variable 'ansible_pipelining' from 
source: unknown 11036 1726773091.35772: variable 'ansible_timeout' from source: unknown 11036 1726773091.35776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11036 1726773091.35849: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11036 1726773091.35864: variable 'omit' from source: magic vars 11036 1726773091.35871: starting attempt loop 11036 1726773091.35874: running the handler 11036 1726773091.35887: _low_level_execute_command(): starting 11036 1726773091.35895: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11036 1726773091.38526: stdout chunk (state=2): >>>/root <<< 11036 1726773091.38645: stderr chunk (state=3): >>><<< 11036 1726773091.38652: stdout chunk (state=3): >>><<< 11036 1726773091.38668: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11036 1726773091.38678: _low_level_execute_command(): starting 11036 1726773091.38684: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580 `" && echo ansible-tmp-1726773091.3867466-11036-264853969773580="` echo /root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580 `" ) && sleep 0' 11036 1726773091.41231: stdout chunk (state=2): >>>ansible-tmp-1726773091.3867466-11036-264853969773580=/root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580 <<< 11036 1726773091.41360: stderr chunk (state=3): >>><<< 11036 1726773091.41366: stdout chunk (state=3): >>><<< 11036 1726773091.41380: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773091.3867466-11036-264853969773580=/root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580 , stderr= 11036 1726773091.41448: variable 'ansible_module_compression' from source: unknown 11036 1726773091.41490: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11036 1726773091.41521: variable 'ansible_facts' from source: unknown 11036 1726773091.41587: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580/AnsiballZ_stat.py 11036 1726773091.41667: Sending initial data 11036 1726773091.41674: Sent initial data (152 bytes) 11036 1726773091.44283: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmp8u0otyl3 /root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580/AnsiballZ_stat.py <<< 11036 1726773091.45454: stderr chunk (state=3): >>><<< 11036 1726773091.45462: stdout chunk (state=3): >>><<< 11036 1726773091.45480: done transferring module to remote 11036 1726773091.45492: _low_level_execute_command(): starting 11036 1726773091.45497: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580/ /root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580/AnsiballZ_stat.py && sleep 0' 11036 1726773091.47941: stderr chunk (state=2): >>><<< 11036 1726773091.47950: stdout chunk (state=2): >>><<< 11036 1726773091.47964: _low_level_execute_command() done: rc=0, stdout=, stderr= 11036 1726773091.47969: _low_level_execute_command(): starting 11036 
1726773091.47975: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580/AnsiballZ_stat.py && sleep 0' 11036 1726773091.64257: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 142606531, "dev": 51713, "nlink": 1, "atime": 1726773091.2642407, "mtime": 1726773088.6812248, "ctime": 1726773088.6812248, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "2407425296", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 11036 1726773091.65416: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 11036 1726773091.65473: stderr chunk (state=3): >>><<< 11036 1726773091.65481: stdout chunk (state=3): >>><<< 11036 1726773091.65501: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 142606531, "dev": 51713, "nlink": 1, "atime": 1726773091.2642407, "mtime": 1726773088.6812248, "ctime": 1726773088.6812248, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "2407425296", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.47.99 closed. 
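Note the two-step pattern here: the copy action plugin first runs ansible.legacy.stat against the destination to compare checksums, and only then transfers a temporary source file and invokes the copy module proper. The 14-byte file it writes in the next step is consistent with the active profile minus the kernel_settings entry ("virtual-guest" plus a trailing newline). A hedged sketch of what the task presumably looks like; the filter expression is an assumption inferred from the __cur_profile and __active_profile task vars evaluated above:

    # Hedged reconstruction: rewrite active_profile without the kernel_settings entry.
    - name: Ensure kernel_settings is not in active_profile
      copy:
        content: "{{ __cur_profile | replace('kernel_settings', '') | trim }}\n"
        dest: /etc/tuned/active_profile    # __active_profile in the task vars
        mode: "0600"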
11036 1726773091.65552: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11036 1726773091.65959: Sending initial data 11036 1726773091.65966: Sent initial data (141 bytes) 11036 1726773091.68513: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpnwjno09f /root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580/source <<< 11036 1726773091.70164: stderr chunk (state=3): >>><<< 11036 1726773091.70173: stdout chunk (state=3): >>><<< 11036 1726773091.70194: _low_level_execute_command(): starting 11036 1726773091.70201: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580/ /root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580/source && sleep 0' 11036 1726773091.72792: stderr chunk (state=2): >>><<< 11036 1726773091.72808: stdout chunk (state=2): >>><<< 11036 1726773091.72827: _low_level_execute_command() done: rc=0, stdout=, stderr= 11036 1726773091.72853: variable 'ansible_module_compression' from source: unknown 11036 1726773091.72905: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 11036 1726773091.72926: variable 'ansible_facts' from source: unknown 11036 1726773091.73008: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580/AnsiballZ_copy.py 11036 1726773091.73497: Sending initial data 11036 1726773091.73508: Sent initial data (152 bytes) 11036 1726773091.76388: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmp2jraf8ta /root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580/AnsiballZ_copy.py <<< 11036 1726773091.78892: stderr chunk (state=3): >>><<< 11036 1726773091.78902: stdout chunk (state=3): >>><<< 11036 1726773091.78921: done transferring module to remote 11036 1726773091.78930: _low_level_execute_command(): starting 11036 1726773091.78935: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580/ /root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580/AnsiballZ_copy.py && sleep 0' 11036 1726773091.81563: stderr chunk (state=2): >>><<< 11036 1726773091.81575: stdout chunk (state=2): >>><<< 11036 1726773091.81597: _low_level_execute_command() done: rc=0, stdout=, stderr= 11036 1726773091.81605: _low_level_execute_command(): starting 11036 1726773091.81611: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580/AnsiballZ_copy.py && sleep 0' 11036 1726773091.98191: stdout chunk (state=2): >>> {"dest": "/etc/tuned/active_profile", "src": 
"/root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580/source", "md5sum": "9a561d913bcdb5a659ec2dd035975a8e", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 14, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580/source", "_original_basename": "tmpnwjno09f", "follow": false, "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11036 1726773091.99316: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 11036 1726773091.99358: stderr chunk (state=3): >>><<< 11036 1726773091.99365: stdout chunk (state=3): >>><<< 11036 1726773091.99380: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580/source", "md5sum": "9a561d913bcdb5a659ec2dd035975a8e", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 14, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580/source", "_original_basename": "tmpnwjno09f", "follow": false, "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.47.99 closed. 
11036 1726773091.99407: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/active_profile', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580/source', '_original_basename': 'tmpnwjno09f', 'follow': False, 'checksum': '633f07e1b5698d04352d5dca735869bf2fe77897', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11036 1726773091.99419: _low_level_execute_command(): starting 11036 1726773091.99425: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580/ > /dev/null 2>&1 && sleep 0' 11036 1726773092.01837: stderr chunk (state=2): >>><<< 11036 1726773092.01845: stdout chunk (state=2): >>><<< 11036 1726773092.01860: _low_level_execute_command() done: rc=0, stdout=, stderr= 11036 1726773092.01870: handler run complete 11036 1726773092.01890: attempt loop complete, returning result 11036 1726773092.01894: _execute() done 11036 1726773092.01898: dumping result to json 11036 1726773092.01904: done dumping result, returning 11036 1726773092.01911: done running TaskExecutor() for managed_node3/TASK: Ensure kernel_settings is not in active_profile [0affffe7-6841-7dd6-8fa6-00000000058e] 11036 1726773092.01917: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000058e 11036 1726773092.01947: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000058e 11036 1726773092.01950: WORKER PROCESS EXITING changed: [managed_node3] => { "changed": true, "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "md5sum": "9a561d913bcdb5a659ec2dd035975a8e", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 14, "src": "/root/.ansible/tmp/ansible-tmp-1726773091.3867466-11036-264853969773580/source", "state": "file", "uid": 0 } 9733 1726773092.02094: no more pending results, returning what we have 9733 1726773092.02097: results queue empty 9733 1726773092.02097: checking for any_errors_fatal 9733 1726773092.02106: done checking for any_errors_fatal 9733 1726773092.02107: checking for max_fail_percentage 9733 1726773092.02108: done checking for max_fail_percentage 9733 1726773092.02109: checking to see if all hosts have failed and the running result is not ok 9733 1726773092.02110: done checking to see if all hosts have failed 9733 1726773092.02110: getting the remaining hosts for this loop 9733 1726773092.02111: done getting the remaining hosts for this loop 9733 1726773092.02114: getting the next task for host managed_node3 9733 1726773092.02119: done getting next task for host managed_node3 9733 1726773092.02122: ^ task is: TASK: Set profile_mode to auto 9733 1726773092.02124: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773092.02128: getting variables 9733 1726773092.02129: in VariableManager get_vars() 9733 1726773092.02162: Calling all_inventory to load vars for managed_node3 9733 1726773092.02164: Calling groups_inventory to load vars for managed_node3 9733 1726773092.02166: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773092.02177: Calling all_plugins_play to load vars for managed_node3 9733 1726773092.02179: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773092.02181: Calling groups_plugins_play to load vars for managed_node3 9733 1726773092.02315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773092.02435: done with get_vars() 9733 1726773092.02443: done getting variables 9733 1726773092.02489: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set profile_mode to auto] ************************************************ task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:57 Thursday 19 September 2024 15:11:32 -0400 (0:00:00.707) 0:00:37.757 **** 9733 1726773092.02513: entering _queue_task() for managed_node3/copy 9733 1726773092.02687: worker is 1 (out of 1 available) 9733 1726773092.02705: exiting _queue_task() for managed_node3/copy 9733 1726773092.02717: done queuing things up, now waiting for results queue to drain 9733 1726773092.02720: waiting for pending results... 
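The next cleanup step, "Set profile_mode to auto" (cleanup.yml:57), uses the same copy-based pattern to put tuned back into automatic profile selection. Reconstructed from the module args and the 5-byte result that follow, the task is presumably something like this sketch; the exact content expression in the real test is an assumption:

    # Hedged reconstruction of the "Set profile_mode to auto" task.
    - name: Set profile_mode to auto
      copy:
        content: "auto\n"                  # the result below shows a 5-byte file
        dest: /etc/tuned/profile_mode      # derived from __kernel_settings_tuned_profile_mode
        mode: "0600"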
11067 1726773092.02860: running TaskExecutor() for managed_node3/TASK: Set profile_mode to auto 11067 1726773092.02969: in run() - task 0affffe7-6841-7dd6-8fa6-00000000058f 11067 1726773092.02987: variable 'ansible_search_path' from source: unknown 11067 1726773092.02992: variable 'ansible_search_path' from source: unknown 11067 1726773092.03021: calling self._execute() 11067 1726773092.03098: variable 'ansible_host' from source: host vars for 'managed_node3' 11067 1726773092.03108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11067 1726773092.03117: variable 'omit' from source: magic vars 11067 1726773092.03193: variable 'omit' from source: magic vars 11067 1726773092.03226: variable 'omit' from source: magic vars 11067 1726773092.03248: variable '__kernel_settings_tuned_profile_mode' from source: role '' exported vars 11067 1726773092.03461: variable '__kernel_settings_tuned_profile_mode' from source: role '' exported vars 11067 1726773092.03522: variable '__kernel_settings_tuned_dir' from source: role '' exported vars 11067 1726773092.03615: variable 'omit' from source: magic vars 11067 1726773092.03649: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11067 1726773092.03674: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11067 1726773092.03695: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11067 1726773092.03710: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11067 1726773092.03720: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11067 1726773092.03744: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11067 1726773092.03749: variable 'ansible_host' from source: host vars for 'managed_node3' 11067 1726773092.03753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11067 1726773092.03825: Set connection var ansible_timeout to 10 11067 1726773092.03830: Set connection var ansible_shell_type to sh 11067 1726773092.03836: Set connection var ansible_module_compression to ZIP_DEFLATED 11067 1726773092.03842: Set connection var ansible_shell_executable to /bin/sh 11067 1726773092.03847: Set connection var ansible_pipelining to False 11067 1726773092.03854: Set connection var ansible_connection to ssh 11067 1726773092.03869: variable 'ansible_shell_executable' from source: unknown 11067 1726773092.03873: variable 'ansible_connection' from source: unknown 11067 1726773092.03876: variable 'ansible_module_compression' from source: unknown 11067 1726773092.03879: variable 'ansible_shell_type' from source: unknown 11067 1726773092.03883: variable 'ansible_shell_executable' from source: unknown 11067 1726773092.03887: variable 'ansible_host' from source: host vars for 'managed_node3' 11067 1726773092.03890: variable 'ansible_pipelining' from source: unknown 11067 1726773092.03892: variable 'ansible_timeout' from source: unknown 11067 1726773092.03894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11067 1726773092.03982: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11067 1726773092.03995: variable 'omit' from source: magic vars 11067 1726773092.04002: starting attempt loop 11067 1726773092.04006: running the handler 11067 1726773092.04017: _low_level_execute_command(): starting 11067 1726773092.04025: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11067 1726773092.06322: stdout chunk (state=2): >>>/root <<< 11067 1726773092.06440: stderr chunk (state=3): >>><<< 11067 1726773092.06447: stdout chunk (state=3): >>><<< 11067 1726773092.06466: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11067 1726773092.06480: _low_level_execute_command(): starting 11067 1726773092.06488: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317 `" && echo ansible-tmp-1726773092.0647478-11067-232878397549317="` echo /root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317 `" ) && sleep 0' 11067 1726773092.08949: stdout chunk (state=2): >>>ansible-tmp-1726773092.0647478-11067-232878397549317=/root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317 <<< 11067 1726773092.09064: stderr chunk (state=3): >>><<< 11067 1726773092.09071: stdout chunk (state=3): >>><<< 11067 1726773092.09086: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773092.0647478-11067-232878397549317=/root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317 , stderr= 11067 1726773092.09156: variable 'ansible_module_compression' from source: unknown 11067 1726773092.09199: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11067 1726773092.09229: variable 'ansible_facts' from source: unknown 11067 1726773092.09298: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317/AnsiballZ_stat.py 11067 1726773092.09384: Sending initial data 11067 1726773092.09393: Sent initial data (152 bytes) 11067 1726773092.11905: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpx1jltj9w /root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317/AnsiballZ_stat.py <<< 11067 1726773092.13623: stderr chunk (state=3): >>><<< 11067 1726773092.13631: stdout chunk (state=3): >>><<< 11067 1726773092.13650: done transferring module to remote 11067 1726773092.13660: _low_level_execute_command(): starting 11067 1726773092.13666: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317/ /root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317/AnsiballZ_stat.py && sleep 0' 11067 1726773092.16093: stderr chunk (state=2): >>><<< 11067 1726773092.16101: stdout chunk (state=2): >>><<< 11067 1726773092.16115: _low_level_execute_command() done: rc=0, stdout=, stderr= 11067 1726773092.16119: _low_level_execute_command(): starting 11067 1726773092.16124: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317/AnsiballZ_stat.py && sleep 0' 11067 1726773092.32610: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", 
"isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 148897989, "dev": 51713, "nlink": 1, "atime": 1726773071.5581322, "mtime": 1726773088.6812248, "ctime": 1726773088.6812248, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "4277482174", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 11067 1726773092.33840: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 11067 1726773092.33882: stderr chunk (state=3): >>><<< 11067 1726773092.33890: stdout chunk (state=3): >>><<< 11067 1726773092.33907: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 148897989, "dev": 51713, "nlink": 1, "atime": 1726773071.5581322, "mtime": 1726773088.6812248, "ctime": 1726773088.6812248, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "4277482174", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.47.99 closed. 
11067 1726773092.33951: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11067 1726773092.34034: Sending initial data 11067 1726773092.34041: Sent initial data (141 bytes) 11067 1726773092.36686: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpfdnt7di9 /root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317/source <<< 11067 1726773092.37126: stderr chunk (state=3): >>><<< 11067 1726773092.37133: stdout chunk (state=3): >>><<< 11067 1726773092.37153: _low_level_execute_command(): starting 11067 1726773092.37159: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317/ /root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317/source && sleep 0' 11067 1726773092.39607: stderr chunk (state=2): >>><<< 11067 1726773092.39616: stdout chunk (state=2): >>><<< 11067 1726773092.39630: _low_level_execute_command() done: rc=0, stdout=, stderr= 11067 1726773092.39654: variable 'ansible_module_compression' from source: unknown 11067 1726773092.39691: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 11067 1726773092.39709: variable 'ansible_facts' from source: unknown 11067 1726773092.39767: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317/AnsiballZ_copy.py 11067 1726773092.39853: Sending initial data 11067 1726773092.39860: Sent initial data (152 bytes) 11067 1726773092.42459: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmpncwtlm0s /root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317/AnsiballZ_copy.py <<< 11067 1726773092.43681: stderr chunk (state=3): >>><<< 11067 1726773092.43692: stdout chunk (state=3): >>><<< 11067 1726773092.43713: done transferring module to remote 11067 1726773092.43722: _low_level_execute_command(): starting 11067 1726773092.43728: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317/ /root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317/AnsiballZ_copy.py && sleep 0' 11067 1726773092.46230: stderr chunk (state=2): >>><<< 11067 1726773092.46238: stdout chunk (state=2): >>><<< 11067 1726773092.46252: _low_level_execute_command() done: rc=0, stdout=, stderr= 11067 1726773092.46257: _low_level_execute_command(): starting 11067 1726773092.46262: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317/AnsiballZ_copy.py && sleep 0' 11067 1726773092.63092: stdout chunk (state=2): >>> {"dest": "/etc/tuned/profile_mode", "src": 
"/root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317/source", "md5sum": "451e20aff0f489cd2f7d4d73533aa961", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 5, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317/source", "_original_basename": "tmpfdnt7di9", "follow": false, "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11067 1726773092.64348: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 11067 1726773092.64359: stdout chunk (state=3): >>><<< 11067 1726773092.64370: stderr chunk (state=3): >>><<< 11067 1726773092.64387: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317/source", "md5sum": "451e20aff0f489cd2f7d4d73533aa961", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 5, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317/source", "_original_basename": "tmpfdnt7di9", "follow": false, "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.47.99 closed. 
11067 1726773092.64423: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/profile_mode', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317/source', '_original_basename': 'tmpfdnt7di9', 'follow': False, 'checksum': '43683f4e92c48be4b00ddd86e011a4f27fcdbeb5', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11067 1726773092.64437: _low_level_execute_command(): starting 11067 1726773092.64444: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317/ > /dev/null 2>&1 && sleep 0' 11067 1726773092.67177: stderr chunk (state=2): >>><<< 11067 1726773092.67191: stdout chunk (state=2): >>><<< 11067 1726773092.67209: _low_level_execute_command() done: rc=0, stdout=, stderr= 11067 1726773092.67219: handler run complete 11067 1726773092.67245: attempt loop complete, returning result 11067 1726773092.67250: _execute() done 11067 1726773092.67253: dumping result to json 11067 1726773092.67258: done dumping result, returning 11067 1726773092.67266: done running TaskExecutor() for managed_node3/TASK: Set profile_mode to auto [0affffe7-6841-7dd6-8fa6-00000000058f] 11067 1726773092.67273: sending task result for task 0affffe7-6841-7dd6-8fa6-00000000058f 11067 1726773092.67319: done sending task result for task 0affffe7-6841-7dd6-8fa6-00000000058f 11067 1726773092.67323: WORKER PROCESS EXITING changed: [managed_node3] => { "changed": true, "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "md5sum": "451e20aff0f489cd2f7d4d73533aa961", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 5, "src": "/root/.ansible/tmp/ansible-tmp-1726773092.0647478-11067-232878397549317/source", "state": "file", "uid": 0 } 9733 1726773092.67744: no more pending results, returning what we have 9733 1726773092.67747: results queue empty 9733 1726773092.67748: checking for any_errors_fatal 9733 1726773092.67758: done checking for any_errors_fatal 9733 1726773092.67758: checking for max_fail_percentage 9733 1726773092.67760: done checking for max_fail_percentage 9733 1726773092.67761: checking to see if all hosts have failed and the running result is not ok 9733 1726773092.67761: done checking to see if all hosts have failed 9733 1726773092.67762: getting the remaining hosts for this loop 9733 1726773092.67763: done getting the remaining hosts for this loop 9733 1726773092.67766: getting the next task for host managed_node3 9733 1726773092.67772: done getting next task for host managed_node3 9733 1726773092.67774: ^ task is: TASK: Restart tuned 9733 1726773092.67777: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=6, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9733 1726773092.67780: getting variables 9733 1726773092.67782: in VariableManager get_vars() 9733 1726773092.67817: Calling all_inventory to load vars for managed_node3 9733 1726773092.67819: Calling groups_inventory to load vars for managed_node3 9733 1726773092.67821: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773092.67831: Calling all_plugins_play to load vars for managed_node3 9733 1726773092.67834: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773092.67836: Calling groups_plugins_play to load vars for managed_node3 9733 1726773092.68059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773092.68255: done with get_vars() 9733 1726773092.68266: done getting variables 9733 1726773092.68326: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Restart tuned] *********************************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:64 Thursday 19 September 2024 15:11:32 -0400 (0:00:00.658) 0:00:38.416 **** 9733 1726773092.68356: entering _queue_task() for managed_node3/service 9733 1726773092.68561: worker is 1 (out of 1 available) 9733 1726773092.68576: exiting _queue_task() for managed_node3/service 9733 1726773092.68589: done queuing things up, now waiting for results queue to drain 9733 1726773092.68591: waiting for pending results... 
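[annotation] The task queued here, "Restart tuned" (tests/kernel_settings/tasks/cleanup.yml:64), runs the service action in a loop over __kernel_settings_services, with item=tuned. The logged module arguments are name=tuned, state=started, enabled=true, so despite the task name the cleanup step ensures the unit is enabled and running rather than forcing a restart (hence "changed": false in the result below). The task text is not in the log; a sketch consistent with those arguments follows (whether the original uses loop or with_items cannot be told from the trace):

    - name: Restart tuned
      service:
        name: "{{ item }}"
        state: started
        enabled: true
      loop: "{{ __kernel_settings_services }}"
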
11099 1726773092.68920: running TaskExecutor() for managed_node3/TASK: Restart tuned 11099 1726773092.69060: in run() - task 0affffe7-6841-7dd6-8fa6-000000000590 11099 1726773092.69080: variable 'ansible_search_path' from source: unknown 11099 1726773092.69084: variable 'ansible_search_path' from source: unknown 11099 1726773092.69132: variable '__kernel_settings_services' from source: include_vars 11099 1726773092.69458: variable '__kernel_settings_services' from source: include_vars 11099 1726773092.69535: variable 'omit' from source: magic vars 11099 1726773092.69658: variable 'ansible_host' from source: host vars for 'managed_node3' 11099 1726773092.69672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11099 1726773092.69682: variable 'omit' from source: magic vars 11099 1726773092.69761: variable 'omit' from source: magic vars 11099 1726773092.69808: variable 'omit' from source: magic vars 11099 1726773092.69852: variable 'item' from source: unknown 11099 1726773092.69935: variable 'item' from source: unknown 11099 1726773092.69962: variable 'omit' from source: magic vars 11099 1726773092.70001: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11099 1726773092.70034: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11099 1726773092.70056: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11099 1726773092.70073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11099 1726773092.70087: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11099 1726773092.70118: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11099 1726773092.70124: variable 'ansible_host' from source: host vars for 'managed_node3' 11099 1726773092.70128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11099 1726773092.70229: Set connection var ansible_timeout to 10 11099 1726773092.70235: Set connection var ansible_shell_type to sh 11099 1726773092.70241: Set connection var ansible_module_compression to ZIP_DEFLATED 11099 1726773092.70246: Set connection var ansible_shell_executable to /bin/sh 11099 1726773092.70252: Set connection var ansible_pipelining to False 11099 1726773092.70259: Set connection var ansible_connection to ssh 11099 1726773092.70276: variable 'ansible_shell_executable' from source: unknown 11099 1726773092.70281: variable 'ansible_connection' from source: unknown 11099 1726773092.70284: variable 'ansible_module_compression' from source: unknown 11099 1726773092.70289: variable 'ansible_shell_type' from source: unknown 11099 1726773092.70292: variable 'ansible_shell_executable' from source: unknown 11099 1726773092.70295: variable 'ansible_host' from source: host vars for 'managed_node3' 11099 1726773092.70299: variable 'ansible_pipelining' from source: unknown 11099 1726773092.70302: variable 'ansible_timeout' from source: unknown 11099 1726773092.70306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11099 1726773092.70432: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11099 1726773092.70445: variable 'omit' from source: magic vars 11099 1726773092.70452: starting attempt loop 11099 1726773092.70456: running the handler 11099 1726773092.70543: variable 'ansible_facts' from source: unknown 11099 1726773092.70662: _low_level_execute_command(): starting 11099 1726773092.70671: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11099 1726773092.73577: stdout chunk (state=2): >>>/root <<< 11099 1726773092.73591: stderr chunk (state=2): >>><<< 11099 1726773092.73605: stdout chunk (state=3): >>><<< 11099 1726773092.73620: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11099 1726773092.73634: _low_level_execute_command(): starting 11099 1726773092.73639: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773092.736276-11099-75757844840811 `" && echo ansible-tmp-1726773092.736276-11099-75757844840811="` echo /root/.ansible/tmp/ansible-tmp-1726773092.736276-11099-75757844840811 `" ) && sleep 0' 11099 1726773092.76446: stdout chunk (state=2): >>>ansible-tmp-1726773092.736276-11099-75757844840811=/root/.ansible/tmp/ansible-tmp-1726773092.736276-11099-75757844840811 <<< 11099 1726773092.76606: stderr chunk (state=3): >>><<< 11099 1726773092.76614: stdout chunk (state=3): >>><<< 11099 1726773092.76637: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773092.736276-11099-75757844840811=/root/.ansible/tmp/ansible-tmp-1726773092.736276-11099-75757844840811 , stderr= 11099 1726773092.76669: variable 'ansible_module_compression' from source: unknown 11099 1726773092.76726: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9733e80v_zqz/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 11099 1726773092.76790: variable 'ansible_facts' from source: unknown 11099 1726773092.77010: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773092.736276-11099-75757844840811/AnsiballZ_systemd.py 11099 1726773092.77548: Sending initial data 11099 1726773092.77554: Sent initial data (153 bytes) 11099 1726773092.80376: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9733e80v_zqz/tmp63gb3o8t /root/.ansible/tmp/ansible-tmp-1726773092.736276-11099-75757844840811/AnsiballZ_systemd.py <<< 11099 1726773092.83197: stderr chunk (state=3): >>><<< 11099 1726773092.83208: stdout chunk (state=3): >>><<< 11099 1726773092.83233: done transferring module to remote 11099 1726773092.83246: _low_level_execute_command(): starting 11099 1726773092.83253: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773092.736276-11099-75757844840811/ /root/.ansible/tmp/ansible-tmp-1726773092.736276-11099-75757844840811/AnsiballZ_systemd.py && sleep 0' 11099 1726773092.86071: stderr chunk (state=2): >>><<< 11099 1726773092.86081: stdout chunk (state=2): >>><<< 11099 1726773092.86098: _low_level_execute_command() done: rc=0, stdout=, stderr= 11099 1726773092.86105: _low_level_execute_command(): starting 11099 1726773092.86110: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773092.736276-11099-75757844840811/AnsiballZ_systemd.py && sleep 0' 11099 1726773093.13665: stdout chunk (state=2): >>> {"name": "tuned", "changed": 
false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:11:05 EDT", "WatchdogTimestampMonotonic": "480455087", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "15004", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ExecMainStartTimestampMonotonic": "480313127", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "15004", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:11:05 EDT] ; stop_time=[n/a] ; pid=15004 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "15405056", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "Memo<<< 11099 1726773093.13705: stdout chunk (state=3): >>>ryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", 
"TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:11:05 EDT", "StateChangeTimestampMonotonic": "480455091", "InactiveExitTimestamp": "Thu 2024-09-19 15:11:05 EDT", "InactiveExitTimestampMonotonic": "480313185", "ActiveEnterTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ActiveEnterTimestampMonotonic": "480455091", "ActiveExitTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ActiveExitTimestampMonotonic": "480218270", "InactiveEnterTimestamp": "Thu 2024-09-19 15:11:05 EDT", "InactiveEnterTimestampMonotonic": "480310218", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 
2024-09-19 15:11:05 EDT", "ConditionTimestampMonotonic": "480311469", "AssertTimestamp": "Thu 2024-09-19 15:11:05 EDT", "AssertTimestampMonotonic": "480311470", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5c390172c7314a188777ca74147bd412", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11099 1726773093.15347: stderr chunk (state=3): >>>Shared connection to 10.31.47.99 closed. <<< 11099 1726773093.15395: stderr chunk (state=3): >>><<< 11099 1726773093.15408: stdout chunk (state=3): >>><<< 11099 1726773093.15424: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:11:05 EDT", "WatchdogTimestampMonotonic": "480455087", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "15004", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ExecMainStartTimestampMonotonic": "480313127", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "15004", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:11:05 EDT] ; stop_time=[n/a] ; pid=15004 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "15405056", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", 
"LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:11:05 EDT", 
"StateChangeTimestampMonotonic": "480455091", "InactiveExitTimestamp": "Thu 2024-09-19 15:11:05 EDT", "InactiveExitTimestampMonotonic": "480313185", "ActiveEnterTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ActiveEnterTimestampMonotonic": "480455091", "ActiveExitTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ActiveExitTimestampMonotonic": "480218270", "InactiveEnterTimestamp": "Thu 2024-09-19 15:11:05 EDT", "InactiveEnterTimestampMonotonic": "480310218", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ConditionTimestampMonotonic": "480311469", "AssertTimestamp": "Thu 2024-09-19 15:11:05 EDT", "AssertTimestampMonotonic": "480311470", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5c390172c7314a188777ca74147bd412", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.47.99 closed. 11099 1726773093.15816: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773092.736276-11099-75757844840811/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11099 1726773093.15836: _low_level_execute_command(): starting 11099 1726773093.15843: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773092.736276-11099-75757844840811/ > /dev/null 2>&1 && sleep 0' 11099 1726773093.18322: stderr chunk (state=2): >>><<< 11099 1726773093.18332: stdout chunk (state=2): >>><<< 11099 1726773093.18347: _low_level_execute_command() done: rc=0, stdout=, stderr= 11099 1726773093.18354: handler run complete 11099 1726773093.18391: attempt loop complete, returning result 11099 1726773093.18410: variable 'item' from source: unknown 11099 1726773093.18471: variable 'item' from source: unknown ok: [managed_node3] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ActiveEnterTimestampMonotonic": "480455091", "ActiveExitTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ActiveExitTimestampMonotonic": "480218270", "ActiveState": "active", "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target 
system.slice network.target dbus.service sysinit.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 15:11:05 EDT", "AssertTimestampMonotonic": "480311470", "Before": "shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ConditionTimestampMonotonic": "480311469", "ConfigurationDirectoryMode": "0755", "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "15004", "ExecMainStartTimestamp": "Thu 2024-09-19 15:11:05 EDT", "ExecMainStartTimestampMonotonic": "480313127", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:11:05 EDT] ; stop_time=[n/a] ; pid=15004 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestamp": "Thu 2024-09-19 15:11:05 EDT", "InactiveEnterTimestampMonotonic": "480310218", "InactiveExitTimestamp": "Thu 2024-09-19 15:11:05 EDT", "InactiveExitTimestampMonotonic": "480313185", "InvocationID": "5c390172c7314a188777ca74147bd412", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": 
"infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "15004", "MemoryAccounting": "yes", "MemoryCurrent": "15405056", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 15:11:05 EDT", "StateChangeTimestampMonotonic": "480455091", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 15:11:05 EDT", "WatchdogTimestampMonotonic": "480455087", 
"WatchdogUSec": "0" } } 11099 1726773093.18571: dumping result to json 11099 1726773093.18591: done dumping result, returning 11099 1726773093.18599: done running TaskExecutor() for managed_node3/TASK: Restart tuned [0affffe7-6841-7dd6-8fa6-000000000590] 11099 1726773093.18605: sending task result for task 0affffe7-6841-7dd6-8fa6-000000000590 11099 1726773093.18712: done sending task result for task 0affffe7-6841-7dd6-8fa6-000000000590 11099 1726773093.18716: WORKER PROCESS EXITING 9733 1726773093.19274: no more pending results, returning what we have 9733 1726773093.19277: results queue empty 9733 1726773093.19277: checking for any_errors_fatal 9733 1726773093.19280: done checking for any_errors_fatal 9733 1726773093.19280: checking for max_fail_percentage 9733 1726773093.19281: done checking for max_fail_percentage 9733 1726773093.19282: checking to see if all hosts have failed and the running result is not ok 9733 1726773093.19282: done checking to see if all hosts have failed 9733 1726773093.19283: getting the remaining hosts for this loop 9733 1726773093.19283: done getting the remaining hosts for this loop 9733 1726773093.19290: getting the next task for host managed_node3 9733 1726773093.19295: done getting next task for host managed_node3 9733 1726773093.19296: ^ task is: TASK: meta (flush_handlers) 9733 1726773093.19297: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773093.19301: getting variables 9733 1726773093.19302: in VariableManager get_vars() 9733 1726773093.19322: Calling all_inventory to load vars for managed_node3 9733 1726773093.19324: Calling groups_inventory to load vars for managed_node3 9733 1726773093.19325: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773093.19334: Calling all_plugins_play to load vars for managed_node3 9733 1726773093.19336: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773093.19338: Calling groups_plugins_play to load vars for managed_node3 9733 1726773093.19431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773093.19536: done with get_vars() 9733 1726773093.19544: done getting variables 9733 1726773093.19595: in VariableManager get_vars() 9733 1726773093.19604: Calling all_inventory to load vars for managed_node3 9733 1726773093.19605: Calling groups_inventory to load vars for managed_node3 9733 1726773093.19606: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773093.19609: Calling all_plugins_play to load vars for managed_node3 9733 1726773093.19611: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773093.19612: Calling groups_plugins_play to load vars for managed_node3 9733 1726773093.19699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773093.19799: done with get_vars() 9733 1726773093.19811: done queuing things up, now waiting for results queue to drain 9733 1726773093.19813: results queue empty 9733 1726773093.19813: checking for any_errors_fatal 9733 1726773093.19818: done checking for any_errors_fatal 9733 1726773093.19818: checking for max_fail_percentage 9733 1726773093.19819: done checking for max_fail_percentage 9733 1726773093.19819: checking to 
see if all hosts have failed and the running result is not ok 9733 1726773093.19819: done checking to see if all hosts have failed 9733 1726773093.19820: getting the remaining hosts for this loop 9733 1726773093.19820: done getting the remaining hosts for this loop 9733 1726773093.19821: getting the next task for host managed_node3 9733 1726773093.19825: done getting next task for host managed_node3 9733 1726773093.19825: ^ task is: TASK: meta (flush_handlers) 9733 1726773093.19826: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9733 1726773093.19829: getting variables 9733 1726773093.19829: in VariableManager get_vars() 9733 1726773093.19837: Calling all_inventory to load vars for managed_node3 9733 1726773093.19838: Calling groups_inventory to load vars for managed_node3 9733 1726773093.19840: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773093.19842: Calling all_plugins_play to load vars for managed_node3 9733 1726773093.19844: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773093.19845: Calling groups_plugins_play to load vars for managed_node3 9733 1726773093.19920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773093.20024: done with get_vars() 9733 1726773093.20029: done getting variables 9733 1726773093.20056: in VariableManager get_vars() 9733 1726773093.20063: Calling all_inventory to load vars for managed_node3 9733 1726773093.20064: Calling groups_inventory to load vars for managed_node3 9733 1726773093.20065: Calling all_plugins_inventory to load vars for managed_node3 9733 1726773093.20068: Calling all_plugins_play to load vars for managed_node3 9733 1726773093.20069: Calling groups_plugins_inventory to load vars for managed_node3 9733 1726773093.20070: Calling groups_plugins_play to load vars for managed_node3 9733 1726773093.20144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9733 1726773093.20244: done with get_vars() 9733 1726773093.20252: done queuing things up, now waiting for results queue to drain 9733 1726773093.20253: results queue empty 9733 1726773093.20253: checking for any_errors_fatal 9733 1726773093.20255: done checking for any_errors_fatal 9733 1726773093.20255: checking for max_fail_percentage 9733 1726773093.20256: done checking for max_fail_percentage 9733 1726773093.20256: checking to see if all hosts have failed and the running result is not ok 9733 1726773093.20256: done checking to see if all hosts have failed 9733 1726773093.20257: getting the remaining hosts for this loop 9733 1726773093.20257: done getting the remaining hosts for this loop 9733 1726773093.20258: getting the next task for host managed_node3 9733 1726773093.20260: done getting next task for host managed_node3 9733 1726773093.20261: ^ task is: None 9733 1726773093.20261: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9733 1726773093.20262: done queuing things up, now waiting for results queue to drain 9733 1726773093.20262: results queue empty 9733 1726773093.20263: checking for any_errors_fatal 9733 1726773093.20263: done checking for any_errors_fatal 9733 1726773093.20263: checking for max_fail_percentage 9733 1726773093.20264: done checking for max_fail_percentage 9733 1726773093.20264: checking to see if all hosts have failed and the running result is not ok 9733 1726773093.20264: done checking to see if all hosts have failed 9733 1726773093.20265: getting the next task for host managed_node3 9733 1726773093.20267: done getting next task for host managed_node3 9733 1726773093.20267: ^ task is: None 9733 1726773093.20268: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node3 : ok=84 changed=12 unreachable=0 failed=0 skipped=46 rescued=0 ignored=0

Thursday 19 September 2024 15:11:33 -0400 (0:00:00.519) 0:00:38.935 ****
===============================================================================
fedora.linux_system_roles.kernel_settings : Ensure required packages are installed --- 2.90s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12
fedora.linux_system_roles.kernel_settings : Ensure required packages are installed --- 2.84s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12
Ensure python command exists for tests below ---------------------------- 2.81s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings_conf_files.yml:14
fedora.linux_system_roles.kernel_settings : Ensure required packages are installed --- 2.81s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12
fedora.linux_system_roles.kernel_settings : Tuned apply settings -------- 1.50s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157
Gathering Facts --------------------------------------------------------- 1.14s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_simple_settings.yml:2
fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile --- 0.76s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91
fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes --- 0.75s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149
fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started --- 0.73s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67
fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory --- 0.71s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50
Ensure kernel_settings is not in active_profile ------------------------- 0.71s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:46
fedora.linux_system_roles.kernel_settings : Apply kernel settings ------- 0.70s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112
fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory --- 0.70s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50
fedora.linux_system_roles.kernel_settings : Apply kernel settings ------- 0.69s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112
fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory --- 0.68s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50
Put expected contents into temporary file ------------------------------- 0.67s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/assert_kernel_settings_conf_files.yml:8
Set profile_mode to auto ------------------------------------------------ 0.66s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:57
fedora.linux_system_roles.kernel_settings : Apply kernel settings ------- 0.66s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112
fedora.linux_system_roles.kernel_settings : Set profile_mode to manual --- 0.65s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99
fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly --- 0.63s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2
9733 1726773093.20351: RUNNING CLEANUP
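[annotation] The two "meta (flush_handlers)" tasks scheduled just before the recap are the implicit handler-flush steps Ansible appends at the end of each play. For reference, the same flush can be requested explicitly at any point in a playbook (an illustrative snippet, not part of the logged run):

    - name: Flush any notified handlers before continuing
      ansible.builtin.meta: flush_handlers

The per-task timing table above (task name, duration, task path) is the style of summary produced by a timing callback such as ansible.posix.profile_tasks; the specific callback enabled for this run is not identified in the log.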