[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. 8283 1726776616.07062: starting run ansible-playbook [core 2.16.11] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-uMf executable location = /usr/local/bin/ansible-playbook python version = 3.12.1 (main, Feb 21 2024, 14:18:26) [GCC 8.5.0 20210514 (Red Hat 8.5.0-21)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 8283 1726776616.07502: Added group all to inventory 8283 1726776616.07504: Added group ungrouped to inventory 8283 1726776616.07508: Group all now contains ungrouped 8283 1726776616.07511: Examining possible inventory source: /tmp/kernel_settings-iny/inventory.yml 8283 1726776616.22267: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 8283 1726776616.22311: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 8283 1726776616.22328: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 8283 1726776616.22370: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 8283 1726776616.22419: Loaded config def from plugin (inventory/script) 8283 1726776616.22420: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 8283 1726776616.22449: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 8283 1726776616.22506: Loaded config def from plugin (inventory/yaml) 8283 1726776616.22507: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 8283 1726776616.22570: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 8283 1726776616.22846: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 8283 1726776616.22848: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 8283 1726776616.22850: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 8283 1726776616.22854: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 8283 1726776616.22858: Loading data from /tmp/kernel_settings-iny/inventory.yml 8283 1726776616.22898: /tmp/kernel_settings-iny/inventory.yml was not parsable by auto 8283 1726776616.22943: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 8283 1726776616.22973: Loading data from /tmp/kernel_settings-iny/inventory.yml 8283 1726776616.23024: group all already in inventory 8283 1726776616.23030: set inventory_file for managed_node1 8283 1726776616.23033: set inventory_dir for managed_node1 8283 1726776616.23034: Added host managed_node1 to inventory 8283 1726776616.23035: Added host managed_node1 to group all 8283 1726776616.23036: set ansible_host for managed_node1 8283 
1726776616.23036: set ansible_ssh_extra_args for managed_node1 8283 1726776616.23038: set inventory_file for managed_node2 8283 1726776616.23040: set inventory_dir for managed_node2 8283 1726776616.23040: Added host managed_node2 to inventory 8283 1726776616.23041: Added host managed_node2 to group all 8283 1726776616.23042: set ansible_host for managed_node2 8283 1726776616.23042: set ansible_ssh_extra_args for managed_node2 8283 1726776616.23043: set inventory_file for managed_node3 8283 1726776616.23045: set inventory_dir for managed_node3 8283 1726776616.23045: Added host managed_node3 to inventory 8283 1726776616.23046: Added host managed_node3 to group all 8283 1726776616.23046: set ansible_host for managed_node3 8283 1726776616.23047: set ansible_ssh_extra_args for managed_node3 8283 1726776616.23048: Reconcile groups and hosts in inventory. 8283 1726776616.23051: Group ungrouped now contains managed_node1 8283 1726776616.23053: Group ungrouped now contains managed_node2 8283 1726776616.23054: Group ungrouped now contains managed_node3 8283 1726776616.23105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 8283 1726776616.23187: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 8283 1726776616.23217: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 8283 1726776616.23236: Loaded config def from plugin (vars/host_group_vars) 8283 1726776616.23238: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 8283 1726776616.23242: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 8283 1726776616.23247: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8283 1726776616.23276: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 8283 1726776616.23537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776616.23609: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 8283 1726776616.23645: Loaded config def from plugin (connection/local) 8283 1726776616.23648: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 8283 1726776616.24181: Loaded config def from plugin (connection/paramiko_ssh) 8283 1726776616.24186: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 8283 1726776616.25007: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 8283 1726776616.25044: Loaded config def from plugin (connection/psrp) 8283 1726776616.25047: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 8283 1726776616.25713: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 8283 1726776616.25738: Loaded config def from plugin (connection/ssh) 8283 1726776616.25739: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 8283 1726776616.26889: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 8283 1726776616.26913: Loaded config def from plugin (connection/winrm) 8283 1726776616.26915: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 8283 1726776616.26937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 8283 1726776616.26979: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 8283 1726776616.27021: Loaded config def from plugin (shell/cmd) 8283 1726776616.27022: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 8283 1726776616.27041: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 8283 1726776616.27078: Loaded config def from plugin (shell/powershell) 8283 1726776616.27079: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 8283 1726776616.27117: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 8283 1726776616.27218: Loaded config def from plugin (shell/sh) 8283 1726776616.27220: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 8283 1726776616.27245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 8283 1726776616.27318: Loaded config def from plugin (become/runas) 8283 1726776616.27319: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 8283 1726776616.27428: Loaded config def from plugin (become/su) 8283 1726776616.27431: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 8283 1726776616.27522: Loaded config def from plugin (become/sudo) 8283 1726776616.27523: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 8283 1726776616.27558: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_default.yml 8283 1726776616.27913: in VariableManager get_vars() 8283 1726776616.27935: done with get_vars() 8283 1726776616.27976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 8283 1726776616.27989: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to 
ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 8283 1726776616.28332: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 8283 1726776616.28416: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 8283 1726776616.28418: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-uMf/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 8283 1726776616.28440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 8283 1726776616.28458: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 8283 1726776616.28559: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 8283 1726776616.28593: Loaded config def from plugin (callback/default) 8283 1726776616.28595: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 8283 1726776616.29361: Loaded config def from plugin (callback/junit) 8283 1726776616.29363: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 8283 1726776616.29392: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 8283 1726776616.29430: Loaded config def from plugin (callback/minimal) 8283 1726776616.29432: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 8283 1726776616.29459: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 8283 1726776616.29496: Loaded config def from plugin (callback/tree) 8283 1726776616.29497: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 8283 1726776616.29571: Loaded config def from plugin 
(callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 8283 1726776616.29572: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-uMf/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: tests_default.yml **************************************************** 1 plays in /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_default.yml 8283 1726776616.29588: in VariableManager get_vars() 8283 1726776616.29598: done with get_vars() 8283 1726776616.29602: in VariableManager get_vars() 8283 1726776616.29607: done with get_vars() 8283 1726776616.29609: variable 'omit' from source: magic vars 8283 1726776616.29636: in VariableManager get_vars() 8283 1726776616.29645: done with get_vars() 8283 1726776616.29660: variable 'omit' from source: magic vars PLAY [Ensure that the role runs with default parameters] *********************** 8283 1726776616.31625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 8283 1726776616.31679: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 8283 1726776616.31708: getting the remaining hosts for this loop 8283 1726776616.31709: done getting the remaining hosts for this loop 8283 1726776616.31712: getting the next task for host managed_node3 8283 1726776616.31715: done getting next task for host managed_node3 8283 1726776616.31716: ^ task is: TASK: meta (flush_handlers) 8283 1726776616.31717: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8283 1726776616.31722: getting variables 8283 1726776616.31724: in VariableManager get_vars() 8283 1726776616.31733: Calling all_inventory to load vars for managed_node3 8283 1726776616.31735: Calling groups_inventory to load vars for managed_node3 8283 1726776616.31737: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776616.31745: Calling all_plugins_play to load vars for managed_node3 8283 1726776616.31751: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776616.31753: Calling groups_plugins_play to load vars for managed_node3 8283 1726776616.31777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776616.31805: done with get_vars() 8283 1726776616.31809: done getting variables 8283 1726776616.31927: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 8283 1726776616.31965: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 8283 1726776616.31996: in VariableManager get_vars() 8283 1726776616.32002: Calling all_inventory to load vars for managed_node3 8283 1726776616.32003: Calling groups_inventory to load vars for managed_node3 8283 1726776616.32004: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776616.32007: Calling all_plugins_play to load vars for managed_node3 8283 1726776616.32008: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776616.32010: Calling groups_plugins_play to load vars for managed_node3 8283 1726776616.32027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776616.32037: done with get_vars() 8283 1726776616.32044: done queuing things up, now waiting for results queue to drain 8283 1726776616.32045: results queue empty 8283 1726776616.32045: checking for any_errors_fatal 8283 1726776616.32047: done checking for any_errors_fatal 8283 1726776616.32047: checking for max_fail_percentage 8283 1726776616.32048: done checking for max_fail_percentage 8283 1726776616.32049: checking to see if all hosts have failed and the running result is not ok 8283 1726776616.32049: done checking to see if all hosts have failed 8283 1726776616.32050: getting the remaining hosts for this loop 8283 1726776616.32050: done getting the remaining hosts for this loop 8283 1726776616.32052: getting the next task for host managed_node3 8283 1726776616.32054: done getting next task for host managed_node3 8283 1726776616.32057: ^ task is: TASK: Run role with no settings 8283 1726776616.32058: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8283 1726776616.32060: getting variables 8283 1726776616.32061: in VariableManager get_vars() 8283 1726776616.32066: Calling all_inventory to load vars for managed_node3 8283 1726776616.32067: Calling groups_inventory to load vars for managed_node3 8283 1726776616.32068: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776616.32071: Calling all_plugins_play to load vars for managed_node3 8283 1726776616.32072: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776616.32074: Calling groups_plugins_play to load vars for managed_node3 8283 1726776616.32091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776616.32099: done with get_vars() 8283 1726776616.32103: done getting variables TASK [Run role with no settings] *********************************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_default.yml:8 Thursday 19 September 2024 16:10:16 -0400 (0:00:00.026) 0:00:00.026 **** 8283 1726776616.32146: entering _queue_task() for managed_node3/include_role 8283 1726776616.32147: Creating lock for include_role 8283 1726776616.32349: worker is 1 (out of 1 available) 8283 1726776616.32362: exiting _queue_task() for managed_node3/include_role 8283 1726776616.32372: done queuing things up, now waiting for results queue to drain 8283 1726776616.32373: waiting for pending results... 8296 1726776616.32452: running TaskExecutor() for managed_node3/TASK: Run role with no settings 8296 1726776616.32546: in run() - task 120fa90a-8a95-c4e4-06a7-000000000006 8296 1726776616.32561: variable 'ansible_search_path' from source: unknown 8296 1726776616.32589: calling self._execute() 8296 1726776616.32633: variable 'ansible_host' from source: host vars for 'managed_node3' 8296 1726776616.32641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8296 1726776616.32650: variable 'omit' from source: magic vars 8296 1726776616.32714: _execute() done 8296 1726776616.32720: dumping result to json 8296 1726776616.32725: done dumping result, returning 8296 1726776616.32731: done running TaskExecutor() for managed_node3/TASK: Run role with no settings [120fa90a-8a95-c4e4-06a7-000000000006] 8296 1726776616.32740: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000006 8296 1726776616.32766: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000006 8296 1726776616.32769: WORKER PROCESS EXITING 8283 1726776616.32876: no more pending results, returning what we have 8283 1726776616.32880: in VariableManager get_vars() 8283 1726776616.32902: Calling all_inventory to load vars for managed_node3 8283 1726776616.32904: Calling groups_inventory to load vars for managed_node3 8283 1726776616.32906: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776616.32914: Calling all_plugins_play to load vars for managed_node3 8283 1726776616.32917: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776616.32919: Calling groups_plugins_play to load vars for managed_node3 8283 1726776616.32951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776616.32962: done with get_vars() 8283 1726776616.32965: variable 'ansible_search_path' from source: unknown 8283 1726776616.33011: variable 'omit' from source: magic vars 8283 1726776616.33026: variable 'omit' from source: magic vars 8283 1726776616.33038: 
variable 'omit' from source: magic vars 8283 1726776616.33041: we have included files to process 8283 1726776616.33041: generating all_blocks data 8283 1726776616.33042: done generating all_blocks data 8283 1726776616.33042: processing included file: fedora.linux_system_roles.kernel_settings 8283 1726776616.33058: in VariableManager get_vars() 8283 1726776616.33065: done with get_vars() 8283 1726776616.33106: in VariableManager get_vars() 8283 1726776616.33114: done with get_vars() 8283 1726776616.33140: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml 8283 1726776616.33219: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml 8283 1726776616.33262: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml 8283 1726776616.33352: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml 8283 1726776616.35342: trying /usr/local/lib/python3.12/site-packages/ansible/modules 8283 1726776616.35477: in VariableManager get_vars() 8283 1726776616.35491: done with get_vars() 8283 1726776616.36455: in VariableManager get_vars() 8283 1726776616.36472: done with get_vars() 8283 1726776616.36580: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml 8283 1726776616.37017: iterating over new_blocks loaded from include file 8283 1726776616.37018: in VariableManager get_vars() 8283 1726776616.37035: done with get_vars() 8283 1726776616.37037: filtering new block on tags 8283 1726776616.37050: done filtering new block on tags 8283 1726776616.37052: in VariableManager get_vars() 8283 1726776616.37065: done with get_vars() 8283 1726776616.37066: filtering new block on tags 8283 1726776616.37079: done filtering new block on tags 8283 1726776616.37081: in VariableManager get_vars() 8283 1726776616.37103: done with get_vars() 8283 1726776616.37104: filtering new block on tags 8283 1726776616.37126: done filtering new block on tags 8283 1726776616.37128: in VariableManager get_vars() 8283 1726776616.37138: done with get_vars() 8283 1726776616.37139: filtering new block on tags 8283 1726776616.37149: done filtering new block on tags 8283 1726776616.37150: done iterating over new_blocks loaded from include file 8283 1726776616.37151: extending task lists for all hosts with included blocks 8283 1726776616.37188: done extending task lists 8283 1726776616.37189: done processing included files 8283 1726776616.37189: results queue empty 8283 1726776616.37189: checking for any_errors_fatal 8283 1726776616.37191: done checking for any_errors_fatal 8283 1726776616.37192: checking for max_fail_percentage 8283 1726776616.37192: done checking for max_fail_percentage 8283 1726776616.37192: checking to see if all hosts have failed and the running result is not ok 8283 1726776616.37193: done checking to see if all hosts have failed 8283 1726776616.37193: getting the remaining hosts for this loop 8283 1726776616.37194: done getting the remaining hosts for this loop 8283 1726776616.37195: getting the next task for host managed_node3 8283 1726776616.37198: done getting next task for host managed_node3 8283 1726776616.37200: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8283 1726776616.37201: ^ state is: HOST STATE: block=2, task=2, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776616.37207: getting variables 8283 1726776616.37208: in VariableManager get_vars() 8283 1726776616.37216: Calling all_inventory to load vars for managed_node3 8283 1726776616.37217: Calling groups_inventory to load vars for managed_node3 8283 1726776616.37218: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776616.37222: Calling all_plugins_play to load vars for managed_node3 8283 1726776616.37223: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776616.37224: Calling groups_plugins_play to load vars for managed_node3 8283 1726776616.37246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776616.37263: done with get_vars() 8283 1726776616.37268: done getting variables 8283 1726776616.37310: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2 Thursday 19 September 2024 16:10:16 -0400 (0:00:00.051) 0:00:00.078 **** 8283 1726776616.37334: entering _queue_task() for managed_node3/fail 8283 1726776616.37335: Creating lock for fail 8283 1726776616.37520: worker is 1 (out of 1 available) 8283 1726776616.37534: exiting _queue_task() for managed_node3/fail 8283 1726776616.37546: done queuing things up, now waiting for results queue to drain 8283 1726776616.37548: waiting for pending results... 
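The trace above shows the play "Ensure that the role runs with default parameters" dispatching the include_role task "Run role with no settings" (tests_default.yml:8) and then loading the kernel_settings role's vars, defaults, meta, tasks and handlers files. Reconstructed only from the play name, task name and action visible in the log, the test playbook plausibly looks something like the sketch below; the hosts: line and the exact layout are assumptions, not taken from the file itself.

- name: Ensure that the role runs with default parameters
  hosts: all
  tasks:
    - name: Run role with no settings
      include_role:
        name: fedora.linux_system_roles.kernel_settings

Since the play sets no kernel_settings_* variables, the role is exercised purely on its defaults, which is consistent with the task name and with the skipped conditional check that follows.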
8297 1726776616.37646: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8297 1726776616.37747: in run() - task 120fa90a-8a95-c4e4-06a7-000000000023 8297 1726776616.37763: variable 'ansible_search_path' from source: unknown 8297 1726776616.37768: variable 'ansible_search_path' from source: unknown 8297 1726776616.37795: calling self._execute() 8297 1726776616.37841: variable 'ansible_host' from source: host vars for 'managed_node3' 8297 1726776616.37850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8297 1726776616.37858: variable 'omit' from source: magic vars 8297 1726776616.38176: variable 'kernel_settings_sysctl' from source: role '' defaults 8297 1726776616.38187: variable '__kernel_settings_state_empty' from source: role '' all vars 8297 1726776616.38199: Evaluated conditional (kernel_settings_sysctl != __kernel_settings_state_empty): True 8297 1726776616.38395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8297 1726776616.39908: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8297 1726776616.39967: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8297 1726776616.39995: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8297 1726776616.40022: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8297 1726776616.40044: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8297 1726776616.40098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8297 1726776616.40120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8297 1726776616.40142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8297 1726776616.40171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8297 1726776616.40183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8297 1726776616.40220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8297 1726776616.40239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8297 1726776616.40259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 8297 1726776616.40285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8297 1726776616.40296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8297 1726776616.40323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8297 1726776616.40343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8297 1726776616.40362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8297 1726776616.40388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8297 1726776616.40399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8297 1726776616.40578: variable 'kernel_settings_sysctl' from source: role '' defaults 8297 1726776616.40600: Evaluated conditional ((kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", true) | list | length > 0) or (kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", false) | list | length > 0)): False 8297 1726776616.40606: when evaluation is False, skipping this task 8297 1726776616.40610: _execute() done 8297 1726776616.40614: dumping result to json 8297 1726776616.40619: done dumping result, returning 8297 1726776616.40625: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [120fa90a-8a95-c4e4-06a7-000000000023] 8297 1726776616.40633: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000023 8297 1726776616.40655: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000023 8297 1726776616.40658: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "(kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", true) | list | length > 0) or (kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", false) | list | length > 0)", "skip_reason": "Conditional result was False" } 8283 1726776616.40782: no more pending results, returning what we have 8283 1726776616.40785: results queue empty 8283 1726776616.40785: checking for any_errors_fatal 8283 1726776616.40787: done checking for any_errors_fatal 8283 1726776616.40788: checking for max_fail_percentage 8283 1726776616.40789: done checking for max_fail_percentage 8283 1726776616.40789: checking to see if all hosts have failed and the running result is not ok 8283 1726776616.40790: done checking to see if all hosts have failed 8283 1726776616.40790: getting the remaining hosts for this loop 8283 
1726776616.40791: done getting the remaining hosts for this loop 8283 1726776616.40795: getting the next task for host managed_node3 8283 1726776616.40800: done getting next task for host managed_node3 8283 1726776616.40803: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8283 1726776616.40806: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776616.40822: getting variables 8283 1726776616.40823: in VariableManager get_vars() 8283 1726776616.40853: Calling all_inventory to load vars for managed_node3 8283 1726776616.40858: Calling groups_inventory to load vars for managed_node3 8283 1726776616.40860: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776616.40868: Calling all_plugins_play to load vars for managed_node3 8283 1726776616.40870: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776616.40872: Calling groups_plugins_play to load vars for managed_node3 8283 1726776616.40900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776616.40920: done with get_vars() 8283 1726776616.40926: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Set version specific variables] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:9 Thursday 19 September 2024 16:10:16 -0400 (0:00:00.036) 0:00:00.114 **** 8283 1726776616.40993: entering _queue_task() for managed_node3/include_tasks 8283 1726776616.40994: Creating lock for include_tasks 8283 1726776616.41151: worker is 1 (out of 1 available) 8283 1726776616.41167: exiting _queue_task() for managed_node3/include_tasks 8283 1726776616.41176: done queuing things up, now waiting for results queue to drain 8283 1726776616.41177: waiting for pending results... 
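The task skipped above ("Check sysctl settings for boolean values", roles/kernel_settings/tasks/main.yml:2) loads the fail action plugin and evaluates two conditionals: kernel_settings_sysctl != __kernel_settings_state_empty (True) and the selectattr test for boolean values (False). Assuming those conditionals map one-to-one onto when: clauses, the task is probably shaped roughly like the sketch below; the fail message text is invented for illustration.

- name: Check sysctl settings for boolean values
  fail:
    msg: kernel_settings_sysctl entries must not use raw boolean values
  when:
    - kernel_settings_sysctl != __kernel_settings_state_empty
    - >-
      (kernel_settings_sysctl | selectattr("value", "defined") |
       selectattr("value", "sameas", true) | list | length > 0) or
      (kernel_settings_sysctl | selectattr("value", "defined") |
       selectattr("value", "sameas", false) | list | length > 0)

Because entries in a when: list are ANDed, the single False selectattr condition is enough to skip the task, which matches the skipping: result and false_condition shown above.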
8298 1726776616.41266: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8298 1726776616.41360: in run() - task 120fa90a-8a95-c4e4-06a7-000000000024 8298 1726776616.41375: variable 'ansible_search_path' from source: unknown 8298 1726776616.41379: variable 'ansible_search_path' from source: unknown 8298 1726776616.41405: calling self._execute() 8298 1726776616.41451: variable 'ansible_host' from source: host vars for 'managed_node3' 8298 1726776616.41459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8298 1726776616.41464: variable 'omit' from source: magic vars 8298 1726776616.41527: _execute() done 8298 1726776616.41533: dumping result to json 8298 1726776616.41536: done dumping result, returning 8298 1726776616.41540: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables [120fa90a-8a95-c4e4-06a7-000000000024] 8298 1726776616.41546: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000024 8298 1726776616.41565: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000024 8298 1726776616.41567: WORKER PROCESS EXITING 8283 1726776616.41743: no more pending results, returning what we have 8283 1726776616.41746: in VariableManager get_vars() 8283 1726776616.41772: Calling all_inventory to load vars for managed_node3 8283 1726776616.41774: Calling groups_inventory to load vars for managed_node3 8283 1726776616.41775: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776616.41782: Calling all_plugins_play to load vars for managed_node3 8283 1726776616.41784: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776616.41785: Calling groups_plugins_play to load vars for managed_node3 8283 1726776616.41813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776616.41827: done with get_vars() 8283 1726776616.41834: variable 'ansible_search_path' from source: unknown 8283 1726776616.41834: variable 'ansible_search_path' from source: unknown 8283 1726776616.41861: we have included files to process 8283 1726776616.41861: generating all_blocks data 8283 1726776616.41862: done generating all_blocks data 8283 1726776616.41867: processing included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8283 1726776616.41867: loading included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8283 1726776616.41869: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml included: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml for managed_node3 8283 1726776616.42421: done processing included file 8283 1726776616.42422: iterating over new_blocks loaded from include file 8283 1726776616.42423: in VariableManager get_vars() 8283 1726776616.42440: done with get_vars() 8283 1726776616.42441: filtering new block on tags 8283 1726776616.42450: done filtering new block on tags 8283 1726776616.42452: in VariableManager get_vars() 8283 1726776616.42466: done with get_vars() 8283 1726776616.42467: filtering new block on tags 8283 1726776616.42478: done filtering new block on tags 8283 1726776616.42479: in VariableManager get_vars() 8283 1726776616.42490: done with get_vars() 
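The "Set version specific variables" task traced above (roles/kernel_settings/tasks/main.yml:9) is an include_tasks that pulls in set_vars.yml for managed_node3 at run time, which is why the log only shows "we have included files to process" and the block/tag filtering after the task itself has already returned. Inferred from the task name, action and file path alone, it is likely little more than:

- name: Set version specific variables
  include_tasks: set_vars.yml  # dynamic include; loaded and filtered on tags at run time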
8283 1726776616.42491: filtering new block on tags 8283 1726776616.42501: done filtering new block on tags 8283 1726776616.42502: in VariableManager get_vars() 8283 1726776616.42514: done with get_vars() 8283 1726776616.42514: filtering new block on tags 8283 1726776616.42521: done filtering new block on tags 8283 1726776616.42522: done iterating over new_blocks loaded from include file 8283 1726776616.42523: extending task lists for all hosts with included blocks 8283 1726776616.42616: done extending task lists 8283 1726776616.42617: done processing included files 8283 1726776616.42618: results queue empty 8283 1726776616.42618: checking for any_errors_fatal 8283 1726776616.42619: done checking for any_errors_fatal 8283 1726776616.42620: checking for max_fail_percentage 8283 1726776616.42620: done checking for max_fail_percentage 8283 1726776616.42621: checking to see if all hosts have failed and the running result is not ok 8283 1726776616.42621: done checking to see if all hosts have failed 8283 1726776616.42621: getting the remaining hosts for this loop 8283 1726776616.42622: done getting the remaining hosts for this loop 8283 1726776616.42623: getting the next task for host managed_node3 8283 1726776616.42626: done getting next task for host managed_node3 8283 1726776616.42628: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8283 1726776616.42631: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8283 1726776616.42637: getting variables 8283 1726776616.42637: in VariableManager get_vars() 8283 1726776616.42645: Calling all_inventory to load vars for managed_node3 8283 1726776616.42646: Calling groups_inventory to load vars for managed_node3 8283 1726776616.42647: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776616.42652: Calling all_plugins_play to load vars for managed_node3 8283 1726776616.42654: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776616.42657: Calling groups_plugins_play to load vars for managed_node3 8283 1726776616.42675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776616.42690: done with get_vars() 8283 1726776616.42694: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2 Thursday 19 September 2024 16:10:16 -0400 (0:00:00.017) 0:00:00.132 **** 8283 1726776616.42740: entering _queue_task() for managed_node3/setup 8283 1726776616.42893: worker is 1 (out of 1 available) 8283 1726776616.42904: exiting _queue_task() for managed_node3/setup 8283 1726776616.42914: done queuing things up, now waiting for results queue to drain 8283 1726776616.42916: waiting for pending results... 8299 1726776616.43015: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8299 1726776616.43120: in run() - task 120fa90a-8a95-c4e4-06a7-000000000085 8299 1726776616.43136: variable 'ansible_search_path' from source: unknown 8299 1726776616.43139: variable 'ansible_search_path' from source: unknown 8299 1726776616.43165: calling self._execute() 8299 1726776616.43208: variable 'ansible_host' from source: host vars for 'managed_node3' 8299 1726776616.43215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8299 1726776616.43224: variable 'omit' from source: magic vars 8299 1726776616.43565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8299 1726776616.45353: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8299 1726776616.45399: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8299 1726776616.45426: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8299 1726776616.45457: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8299 1726776616.45476: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8299 1726776616.45525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8299 1726776616.45550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8299 1726776616.45573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 8299 1726776616.45600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8299 1726776616.45612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8299 1726776616.45650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8299 1726776616.45670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8299 1726776616.45690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8299 1726776616.45716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8299 1726776616.45727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8299 1726776616.45845: variable '__kernel_settings_required_facts' from source: role '' all vars 8299 1726776616.45858: variable 'ansible_facts' from source: unknown 8299 1726776616.45879: Evaluated conditional (__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0): True 8299 1726776616.45886: variable 'omit' from source: magic vars 8299 1726776616.45924: variable 'omit' from source: magic vars 8299 1726776616.45944: variable '__kernel_settings_required_facts_subsets' from source: role '' all vars 8299 1726776616.45999: variable '__kernel_settings_required_facts_subsets' from source: role '' all vars 8299 1726776616.46067: variable '__kernel_settings_required_facts' from source: role '' all vars 8299 1726776616.46122: variable 'omit' from source: magic vars 8299 1726776616.46144: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8299 1726776616.46175: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8299 1726776616.46190: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8299 1726776616.46204: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8299 1726776616.46214: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8299 1726776616.46242: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8299 1726776616.46247: variable 'ansible_host' from source: host vars for 'managed_node3' 8299 1726776616.46251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8299 1726776616.46312: Set connection var ansible_module_compression to ZIP_DEFLATED 8299 1726776616.46320: Set connection var ansible_shell_type to sh 8299 1726776616.46326: Set connection var 
ansible_timeout to 10 8299 1726776616.46332: Set connection var ansible_connection to ssh 8299 1726776616.46338: Set connection var ansible_pipelining to False 8299 1726776616.46341: Set connection var ansible_shell_executable to /bin/sh 8299 1726776616.46354: variable 'ansible_shell_executable' from source: unknown 8299 1726776616.46359: variable 'ansible_connection' from source: unknown 8299 1726776616.46361: variable 'ansible_module_compression' from source: unknown 8299 1726776616.46362: variable 'ansible_shell_type' from source: unknown 8299 1726776616.46364: variable 'ansible_shell_executable' from source: unknown 8299 1726776616.46365: variable 'ansible_host' from source: host vars for 'managed_node3' 8299 1726776616.46367: variable 'ansible_pipelining' from source: unknown 8299 1726776616.46369: variable 'ansible_timeout' from source: unknown 8299 1726776616.46371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8299 1726776616.46453: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 8299 1726776616.46463: variable 'omit' from source: magic vars 8299 1726776616.46467: starting attempt loop 8299 1726776616.46469: running the handler 8299 1726776616.46476: _low_level_execute_command(): starting 8299 1726776616.46481: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8299 1726776616.49007: stderr chunk (state=2): >>>Warning: Permanently added '10.31.8.186' (ECDSA) to the list of known hosts. <<< 8299 1726776616.62269: stdout chunk (state=3): >>>/root <<< 8299 1726776616.62507: stderr chunk (state=3): >>><<< 8299 1726776616.62518: stdout chunk (state=3): >>><<< 8299 1726776616.62542: _low_level_execute_command() done: rc=0, stdout=/root , stderr=Warning: Permanently added '10.31.8.186' (ECDSA) to the list of known hosts. 
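The "Ensure ansible_facts used by role" task above (set_vars.yml:2) only executes because __kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0 evaluated True, i.e. facts the role depends on have not been gathered yet. A rough sketch of how such a guarded fact-gathering task could be written is below; the variable names come from the log, but the exact YAML in set_vars.yml is an assumption.

- name: Ensure ansible_facts used by role
  setup:
    gather_subset: "{{ __kernel_settings_required_facts_subsets }}"
  when: __kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0

The gather_subset list that actually reaches the module is visible further down, in the module_args echoed back with the setup result.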
8299 1726776616.62554: _low_level_execute_command(): starting 8299 1726776616.62560: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776616.6254988-8299-14858424477494 `" && echo ansible-tmp-1726776616.6254988-8299-14858424477494="` echo /root/.ansible/tmp/ansible-tmp-1726776616.6254988-8299-14858424477494 `" ) && sleep 0' 8299 1726776616.65173: stdout chunk (state=2): >>>ansible-tmp-1726776616.6254988-8299-14858424477494=/root/.ansible/tmp/ansible-tmp-1726776616.6254988-8299-14858424477494 <<< 8299 1726776616.65315: stderr chunk (state=3): >>><<< 8299 1726776616.65326: stdout chunk (state=3): >>><<< 8299 1726776616.65348: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776616.6254988-8299-14858424477494=/root/.ansible/tmp/ansible-tmp-1726776616.6254988-8299-14858424477494 , stderr= 8299 1726776616.65396: variable 'ansible_module_compression' from source: unknown 8299 1726776616.65448: ANSIBALLZ: Using lock for setup 8299 1726776616.65454: ANSIBALLZ: Acquiring lock 8299 1726776616.65461: ANSIBALLZ: Lock acquired: 140690877998880 8299 1726776616.65466: ANSIBALLZ: Creating module 8299 1726776616.90185: ANSIBALLZ: Writing module into payload 8299 1726776616.90300: ANSIBALLZ: Writing module 8299 1726776616.90321: ANSIBALLZ: Renaming module 8299 1726776616.90327: ANSIBALLZ: Done creating module 8299 1726776616.90359: variable 'ansible_facts' from source: unknown 8299 1726776616.90366: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8299 1726776616.90376: _low_level_execute_command(): starting 8299 1726776616.90382: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'python3.6'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'/usr/libexec/platform-python'"'"'; command -v '"'"'python2.7'"'"'; command -v '"'"'/usr/bin/python'"'"'; command -v '"'"'python'"'"'; echo ENDFOUND && sleep 0' 8299 1726776616.92613: stdout chunk (state=2): >>>PLATFORM <<< 8299 1726776616.92676: stdout chunk (state=3): >>>Linux <<< 8299 1726776616.92691: stdout chunk (state=3): >>>FOUND <<< 8299 1726776616.92699: stdout chunk (state=3): >>>/usr/bin/python3.12 <<< 8299 1726776616.92716: stdout chunk (state=3): >>>/usr/bin/python3.6 <<< 8299 1726776616.92736: stdout chunk (state=3): >>>/usr/bin/python3 <<< 8299 1726776616.92744: stdout chunk (state=3): >>>/usr/libexec/platform-python ENDFOUND <<< 8299 1726776616.92885: stderr chunk (state=3): >>><<< 8299 1726776616.92891: stdout chunk (state=3): >>><<< 8299 1726776616.92904: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3.6 /usr/bin/python3 /usr/libexec/platform-python ENDFOUND , stderr= 8299 1726776616.92910 [managed_node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3.6', '/usr/bin/python3', '/usr/libexec/platform-python'] 8299 1726776616.92944: _low_level_execute_command(): starting 8299 1726776616.92950: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 8299 1726776616.93024: Sending initial data 8299 1726776616.93033: Sent initial data (1234 bytes) 8299 1726776616.96821: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS 
Stream\"\nVERSION=\"8\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"8\"\nPLATFORM_ID=\"platform:el8\"\nPRETTY_NAME=\"CentOS Stream 8\"\nANSI_COLOR=\"0;31\"\nCPE_NAME=\"cpe:/o:centos:centos:8\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 8\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 8299 1726776616.97254: stderr chunk (state=3): >>><<< 8299 1726776616.97261: stdout chunk (state=3): >>><<< 8299 1726776616.97275: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"8\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"8\"\nPLATFORM_ID=\"platform:el8\"\nPRETTY_NAME=\"CentOS Stream 8\"\nANSI_COLOR=\"0;31\"\nCPE_NAME=\"cpe:/o:centos:centos:8\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 8\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr= 8299 1726776616.97319: variable 'ansible_facts' from source: unknown 8299 1726776616.97326: variable 'ansible_facts' from source: unknown 8299 1726776616.97336: variable 'ansible_module_compression' from source: unknown 8299 1726776616.97366: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 8299 1726776616.97389: variable 'ansible_facts' from source: unknown 8299 1726776616.97530: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776616.6254988-8299-14858424477494/AnsiballZ_setup.py 8299 1726776616.97632: Sending initial data 8299 1726776616.97639: Sent initial data (151 bytes) 8299 1726776617.00257: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmp50r8dzoh /root/.ansible/tmp/ansible-tmp-1726776616.6254988-8299-14858424477494/AnsiballZ_setup.py <<< 8299 1726776617.02065: stderr chunk (state=3): >>><<< 8299 1726776617.02072: stdout chunk (state=3): >>><<< 8299 1726776617.02090: done transferring module to remote 8299 1726776617.02100: _low_level_execute_command(): starting 8299 1726776617.02105: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776616.6254988-8299-14858424477494/ /root/.ansible/tmp/ansible-tmp-1726776616.6254988-8299-14858424477494/AnsiballZ_setup.py && sleep 0' 8299 1726776617.04448: stderr chunk (state=2): >>><<< 8299 1726776617.04454: stdout chunk (state=2): >>><<< 8299 1726776617.04469: _low_level_execute_command() done: rc=0, stdout=, stderr= 8299 1726776617.04474: _low_level_execute_command(): starting 8299 1726776617.04479: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776616.6254988-8299-14858424477494/AnsiballZ_setup.py && sleep 0' 8299 1726776617.31087: stdout chunk (state=2): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "8", "ansible_distribution_major_version": "8", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "gather_subset": ["!all", "!min", "distribution", "distribution_major_version", "distribution_version", "os_family"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["!all", "!min", "distribution", "distribution_major_version", 
"distribution_version", "os_family"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 8299 1726776617.32674: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 8299 1726776617.32722: stderr chunk (state=3): >>><<< 8299 1726776617.32731: stdout chunk (state=3): >>><<< 8299 1726776617.32748: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "8", "ansible_distribution_major_version": "8", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "gather_subset": ["!all", "!min", "distribution", "distribution_major_version", "distribution_version", "os_family"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["!all", "!min", "distribution", "distribution_major_version", "distribution_version", "os_family"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=Shared connection to 10.31.8.186 closed. 8299 1726776617.32784: done with _execute_module (setup, {'gather_subset': ['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'], '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776616.6254988-8299-14858424477494/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8299 1726776617.32802: _low_level_execute_command(): starting 8299 1726776617.32810: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776616.6254988-8299-14858424477494/ > /dev/null 2>&1 && sleep 0' 8299 1726776617.35218: stderr chunk (state=2): >>><<< 8299 1726776617.35228: stdout chunk (state=2): >>><<< 8299 1726776617.35244: _low_level_execute_command() done: rc=0, stdout=, stderr= 8299 1726776617.35251: handler run complete 8299 1726776617.35265: variable 'ansible_facts' from source: unknown 8299 1726776617.35295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8299 1726776617.35330: variable 'ansible_facts' from source: unknown 8299 1726776617.35352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8299 1726776617.35368: attempt loop complete, returning result 8299 1726776617.35372: _execute() done 8299 1726776617.35375: dumping result to json 8299 1726776617.35381: done dumping result, returning 8299 1726776617.35389: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role [120fa90a-8a95-c4e4-06a7-000000000085] 8299 1726776617.35394: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000085 8299 1726776617.35421: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000085 8299 1726776617.35424: WORKER PROCESS EXITING ok: [managed_node3] 8283 1726776617.35572: no more pending results, returning what we have 8283 1726776617.35574: results 
queue empty 8283 1726776617.35575: checking for any_errors_fatal 8283 1726776617.35577: done checking for any_errors_fatal 8283 1726776617.35577: checking for max_fail_percentage 8283 1726776617.35579: done checking for max_fail_percentage 8283 1726776617.35579: checking to see if all hosts have failed and the running result is not ok 8283 1726776617.35580: done checking to see if all hosts have failed 8283 1726776617.35580: getting the remaining hosts for this loop 8283 1726776617.35581: done getting the remaining hosts for this loop 8283 1726776617.35584: getting the next task for host managed_node3 8283 1726776617.35592: done getting next task for host managed_node3 8283 1726776617.35595: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8283 1726776617.35599: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776617.35609: getting variables 8283 1726776617.35610: in VariableManager get_vars() 8283 1726776617.35640: Calling all_inventory to load vars for managed_node3 8283 1726776617.35643: Calling groups_inventory to load vars for managed_node3 8283 1726776617.35645: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776617.35653: Calling all_plugins_play to load vars for managed_node3 8283 1726776617.35655: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776617.35657: Calling groups_plugins_play to load vars for managed_node3 8283 1726776617.35710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776617.35741: done with get_vars() 8283 1726776617.35748: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if system is ostree] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10 Thursday 19 September 2024 16:10:17 -0400 (0:00:00.930) 0:00:01.062 **** 8283 1726776617.35812: entering _queue_task() for managed_node3/stat 8283 1726776617.35965: worker is 1 (out of 1 available) 8283 1726776617.35979: exiting _queue_task() for managed_node3/stat 8283 1726776617.35989: done queuing things up, now waiting for results queue to drain 8283 1726776617.35990: waiting for pending results... 
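The task queued above is the role's ostree detection step. Judging only from what this log shows further down (a conditional of "not __kernel_settings_is_ostree is defined", a stat of /run/ostree-booted, and a registered variable named __ostree_booted_stat), the task in set_vars.yml is probably shaped roughly like the following sketch; the exact YAML in the collection may differ:

    - name: Check if system is ostree
      # stat the marker file that exists only on ostree-based systems
      stat:
        path: /run/ostree-booted
      register: __ostree_booted_stat
      when: not __kernel_settings_is_ostree is defined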
8322 1726776617.36092: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8322 1726776617.36202: in run() - task 120fa90a-8a95-c4e4-06a7-000000000087 8322 1726776617.36218: variable 'ansible_search_path' from source: unknown 8322 1726776617.36222: variable 'ansible_search_path' from source: unknown 8322 1726776617.36250: calling self._execute() 8322 1726776617.36296: variable 'ansible_host' from source: host vars for 'managed_node3' 8322 1726776617.36302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8322 1726776617.36307: variable 'omit' from source: magic vars 8322 1726776617.36620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8322 1726776617.36791: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8322 1726776617.36821: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8322 1726776617.36872: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8322 1726776617.36895: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8322 1726776617.36951: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8322 1726776617.36970: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8322 1726776617.36988: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8322 1726776617.37003: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8322 1726776617.37091: Evaluated conditional (not __kernel_settings_is_ostree is defined): True 8322 1726776617.37099: variable 'omit' from source: magic vars 8322 1726776617.37139: variable 'omit' from source: magic vars 8322 1726776617.37163: variable 'omit' from source: magic vars 8322 1726776617.37182: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8322 1726776617.37203: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8322 1726776617.37218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8322 1726776617.37234: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8322 1726776617.37245: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8322 1726776617.37269: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8322 1726776617.37274: variable 'ansible_host' from source: host vars for 'managed_node3' 8322 1726776617.37278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8322 1726776617.37344: Set connection var ansible_module_compression to ZIP_DEFLATED 8322 1726776617.37352: Set connection var ansible_shell_type to sh 
8322 1726776617.37360: Set connection var ansible_timeout to 10 8322 1726776617.37366: Set connection var ansible_connection to ssh 8322 1726776617.37373: Set connection var ansible_pipelining to False 8322 1726776617.37378: Set connection var ansible_shell_executable to /bin/sh 8322 1726776617.37393: variable 'ansible_shell_executable' from source: unknown 8322 1726776617.37396: variable 'ansible_connection' from source: unknown 8322 1726776617.37400: variable 'ansible_module_compression' from source: unknown 8322 1726776617.37403: variable 'ansible_shell_type' from source: unknown 8322 1726776617.37406: variable 'ansible_shell_executable' from source: unknown 8322 1726776617.37409: variable 'ansible_host' from source: host vars for 'managed_node3' 8322 1726776617.37413: variable 'ansible_pipelining' from source: unknown 8322 1726776617.37416: variable 'ansible_timeout' from source: unknown 8322 1726776617.37420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8322 1726776617.37509: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 8322 1726776617.37518: variable 'omit' from source: magic vars 8322 1726776617.37525: starting attempt loop 8322 1726776617.37530: running the handler 8322 1726776617.37541: _low_level_execute_command(): starting 8322 1726776617.37548: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8322 1726776617.39809: stdout chunk (state=2): >>>/root <<< 8322 1726776617.39933: stderr chunk (state=3): >>><<< 8322 1726776617.39940: stdout chunk (state=3): >>><<< 8322 1726776617.39956: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8322 1726776617.39970: _low_level_execute_command(): starting 8322 1726776617.39975: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776617.3996499-8322-145416960641155 `" && echo ansible-tmp-1726776617.3996499-8322-145416960641155="` echo /root/.ansible/tmp/ansible-tmp-1726776617.3996499-8322-145416960641155 `" ) && sleep 0' 8322 1726776617.42389: stdout chunk (state=2): >>>ansible-tmp-1726776617.3996499-8322-145416960641155=/root/.ansible/tmp/ansible-tmp-1726776617.3996499-8322-145416960641155 <<< 8322 1726776617.42513: stderr chunk (state=3): >>><<< 8322 1726776617.42520: stdout chunk (state=3): >>><<< 8322 1726776617.42535: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776617.3996499-8322-145416960641155=/root/.ansible/tmp/ansible-tmp-1726776617.3996499-8322-145416960641155 , stderr= 8322 1726776617.42569: variable 'ansible_module_compression' from source: unknown 8322 1726776617.42612: ANSIBALLZ: Using lock for stat 8322 1726776617.42616: ANSIBALLZ: Acquiring lock 8322 1726776617.42618: ANSIBALLZ: Lock acquired: 140690877998304 8322 1726776617.42620: ANSIBALLZ: Creating module 8322 1726776617.50992: ANSIBALLZ: Writing module into payload 8322 1726776617.51078: ANSIBALLZ: Writing module 8322 1726776617.51097: ANSIBALLZ: Renaming module 8322 1726776617.51104: ANSIBALLZ: Done creating module 8322 1726776617.51119: variable 'ansible_facts' from source: unknown 8322 1726776617.51178: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776617.3996499-8322-145416960641155/AnsiballZ_stat.py 8322 1726776617.51279: Sending initial data 
8322 1726776617.51288: Sent initial data (151 bytes) 8322 1726776617.53804: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpjevbs5uh /root/.ansible/tmp/ansible-tmp-1726776617.3996499-8322-145416960641155/AnsiballZ_stat.py <<< 8322 1726776617.54783: stderr chunk (state=3): >>><<< 8322 1726776617.54791: stdout chunk (state=3): >>><<< 8322 1726776617.54809: done transferring module to remote 8322 1726776617.54820: _low_level_execute_command(): starting 8322 1726776617.54825: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776617.3996499-8322-145416960641155/ /root/.ansible/tmp/ansible-tmp-1726776617.3996499-8322-145416960641155/AnsiballZ_stat.py && sleep 0' 8322 1726776617.57171: stderr chunk (state=2): >>><<< 8322 1726776617.57178: stdout chunk (state=2): >>><<< 8322 1726776617.57191: _low_level_execute_command() done: rc=0, stdout=, stderr= 8322 1726776617.57195: _low_level_execute_command(): starting 8322 1726776617.57200: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776617.3996499-8322-145416960641155/AnsiballZ_stat.py && sleep 0' 8322 1726776617.72093: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8322 1726776617.73121: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 8322 1726776617.73173: stderr chunk (state=3): >>><<< 8322 1726776617.73180: stdout chunk (state=3): >>><<< 8322 1726776617.73195: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.8.186 closed. 
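Each module run in this log goes through a remote temp directory, an sftp put, and a chmod before execution because ansible_pipelining is set to False for this connection (see the connection vars above). This is not something the role controls; if the environment allows it, SSH pipelining skips those round trips. A minimal, purely illustrative group_vars snippet (standard Ansible connection variable, not taken from this playbook) would be:

    # group_vars/all.yml (illustrative tuning, not part of this role)
    ansible_pipelining: true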
8322 1726776617.73222: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776617.3996499-8322-145416960641155/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8322 1726776617.73234: _low_level_execute_command(): starting 8322 1726776617.73240: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776617.3996499-8322-145416960641155/ > /dev/null 2>&1 && sleep 0' 8322 1726776617.75606: stderr chunk (state=2): >>><<< 8322 1726776617.75614: stdout chunk (state=2): >>><<< 8322 1726776617.75631: _low_level_execute_command() done: rc=0, stdout=, stderr= 8322 1726776617.75638: handler run complete 8322 1726776617.75655: attempt loop complete, returning result 8322 1726776617.75661: _execute() done 8322 1726776617.75665: dumping result to json 8322 1726776617.75669: done dumping result, returning 8322 1726776617.75677: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree [120fa90a-8a95-c4e4-06a7-000000000087] 8322 1726776617.75682: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000087 8322 1726776617.75708: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000087 8322 1726776617.75712: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 8283 1726776617.75840: no more pending results, returning what we have 8283 1726776617.75842: results queue empty 8283 1726776617.75843: checking for any_errors_fatal 8283 1726776617.75849: done checking for any_errors_fatal 8283 1726776617.75849: checking for max_fail_percentage 8283 1726776617.75850: done checking for max_fail_percentage 8283 1726776617.75851: checking to see if all hosts have failed and the running result is not ok 8283 1726776617.75851: done checking to see if all hosts have failed 8283 1726776617.75852: getting the remaining hosts for this loop 8283 1726776617.75853: done getting the remaining hosts for this loop 8283 1726776617.75856: getting the next task for host managed_node3 8283 1726776617.75861: done getting next task for host managed_node3 8283 1726776617.75865: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8283 1726776617.75868: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776617.75877: getting variables 8283 1726776617.75878: in VariableManager get_vars() 8283 1726776617.75908: Calling all_inventory to load vars for managed_node3 8283 1726776617.75910: Calling groups_inventory to load vars for managed_node3 8283 1726776617.75912: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776617.75920: Calling all_plugins_play to load vars for managed_node3 8283 1726776617.75922: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776617.75924: Calling groups_plugins_play to load vars for managed_node3 8283 1726776617.75971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776617.76005: done with get_vars() 8283 1726776617.76011: done getting variables 8283 1726776617.76080: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:15 Thursday 19 September 2024 16:10:17 -0400 (0:00:00.402) 0:00:01.465 **** 8283 1726776617.76104: entering _queue_task() for managed_node3/set_fact 8283 1726776617.76105: Creating lock for set_fact 8283 1726776617.76268: worker is 1 (out of 1 available) 8283 1726776617.76281: exiting _queue_task() for managed_node3/set_fact 8283 1726776617.76291: done queuing things up, now waiting for results queue to drain 8283 1726776617.76293: waiting for pending results... 
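The set_fact task queued here converts the registered stat result into the role flag. Given that the worker below reads '__ostree_booted_stat' and reports __kernel_settings_is_ostree: false for a file that does not exist, the task is likely close to this sketch (the wording in the actual role may differ):

    - name: Set flag to indicate system is ostree
      set_fact:
        __kernel_settings_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __kernel_settings_is_ostree is defined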
8330 1726776617.76397: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8330 1726776617.76502: in run() - task 120fa90a-8a95-c4e4-06a7-000000000088 8330 1726776617.76518: variable 'ansible_search_path' from source: unknown 8330 1726776617.76523: variable 'ansible_search_path' from source: unknown 8330 1726776617.76550: calling self._execute() 8330 1726776617.76595: variable 'ansible_host' from source: host vars for 'managed_node3' 8330 1726776617.76603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8330 1726776617.76611: variable 'omit' from source: magic vars 8330 1726776617.76924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8330 1726776617.77121: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8330 1726776617.77154: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8330 1726776617.77188: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8330 1726776617.77214: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8330 1726776617.77275: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8330 1726776617.77295: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8330 1726776617.77314: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8330 1726776617.77334: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8330 1726776617.77420: Evaluated conditional (not __kernel_settings_is_ostree is defined): True 8330 1726776617.77430: variable 'omit' from source: magic vars 8330 1726776617.77470: variable 'omit' from source: magic vars 8330 1726776617.77549: variable '__ostree_booted_stat' from source: set_fact 8330 1726776617.77587: variable 'omit' from source: magic vars 8330 1726776617.77609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8330 1726776617.77630: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8330 1726776617.77645: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8330 1726776617.77660: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8330 1726776617.77671: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8330 1726776617.77693: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8330 1726776617.77698: variable 'ansible_host' from source: host vars for 'managed_node3' 8330 1726776617.77702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8330 1726776617.77769: Set connection var 
ansible_module_compression to ZIP_DEFLATED 8330 1726776617.77776: Set connection var ansible_shell_type to sh 8330 1726776617.77783: Set connection var ansible_timeout to 10 8330 1726776617.77786: Set connection var ansible_connection to ssh 8330 1726776617.77791: Set connection var ansible_pipelining to False 8330 1726776617.77793: Set connection var ansible_shell_executable to /bin/sh 8330 1726776617.77806: variable 'ansible_shell_executable' from source: unknown 8330 1726776617.77809: variable 'ansible_connection' from source: unknown 8330 1726776617.77811: variable 'ansible_module_compression' from source: unknown 8330 1726776617.77812: variable 'ansible_shell_type' from source: unknown 8330 1726776617.77814: variable 'ansible_shell_executable' from source: unknown 8330 1726776617.77816: variable 'ansible_host' from source: host vars for 'managed_node3' 8330 1726776617.77819: variable 'ansible_pipelining' from source: unknown 8330 1726776617.77821: variable 'ansible_timeout' from source: unknown 8330 1726776617.77824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8330 1726776617.77882: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8330 1726776617.77890: variable 'omit' from source: magic vars 8330 1726776617.77895: starting attempt loop 8330 1726776617.77897: running the handler 8330 1726776617.77902: handler run complete 8330 1726776617.77907: attempt loop complete, returning result 8330 1726776617.77909: _execute() done 8330 1726776617.77911: dumping result to json 8330 1726776617.77913: done dumping result, returning 8330 1726776617.77917: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree [120fa90a-8a95-c4e4-06a7-000000000088] 8330 1726776617.77921: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000088 8330 1726776617.77941: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000088 8330 1726776617.77944: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_is_ostree": false }, "changed": false } 8283 1726776617.78089: no more pending results, returning what we have 8283 1726776617.78092: results queue empty 8283 1726776617.78093: checking for any_errors_fatal 8283 1726776617.78097: done checking for any_errors_fatal 8283 1726776617.78097: checking for max_fail_percentage 8283 1726776617.78098: done checking for max_fail_percentage 8283 1726776617.78099: checking to see if all hosts have failed and the running result is not ok 8283 1726776617.78099: done checking to see if all hosts have failed 8283 1726776617.78100: getting the remaining hosts for this loop 8283 1726776617.78101: done getting the remaining hosts for this loop 8283 1726776617.78104: getting the next task for host managed_node3 8283 1726776617.78111: done getting next task for host managed_node3 8283 1726776617.78114: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 8283 1726776617.78117: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776617.78125: getting variables 8283 1726776617.78126: in VariableManager get_vars() 8283 1726776617.78156: Calling all_inventory to load vars for managed_node3 8283 1726776617.78158: Calling groups_inventory to load vars for managed_node3 8283 1726776617.78160: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776617.78165: Calling all_plugins_play to load vars for managed_node3 8283 1726776617.78167: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776617.78168: Calling groups_plugins_play to load vars for managed_node3 8283 1726776617.78200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776617.78225: done with get_vars() 8283 1726776617.78233: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:22 Thursday 19 September 2024 16:10:17 -0400 (0:00:00.021) 0:00:01.487 **** 8283 1726776617.78295: entering _queue_task() for managed_node3/stat 8283 1726776617.78441: worker is 1 (out of 1 available) 8283 1726776617.78454: exiting _queue_task() for managed_node3/stat 8283 1726776617.78463: done queuing things up, now waiting for results queue to drain 8283 1726776617.78465: waiting for pending results... 
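This next queued task mirrors the ostree check but probes for transactional-update support. Based on the conditional and module arguments visible below (stat of /sbin/transactional-update, registered as __transactional_update_stat), a plausible sketch of the task is:

    - name: Check if transactional-update exists in /sbin
      stat:
        path: /sbin/transactional-update
      register: __transactional_update_stat
      when: not __kernel_settings_is_transactional is defined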
8331 1726776617.78564: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 8331 1726776617.78667: in run() - task 120fa90a-8a95-c4e4-06a7-00000000008a 8331 1726776617.78681: variable 'ansible_search_path' from source: unknown 8331 1726776617.78685: variable 'ansible_search_path' from source: unknown 8331 1726776617.78709: calling self._execute() 8331 1726776617.78753: variable 'ansible_host' from source: host vars for 'managed_node3' 8331 1726776617.78764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8331 1726776617.78772: variable 'omit' from source: magic vars 8331 1726776617.79111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8331 1726776617.79274: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8331 1726776617.79306: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8331 1726776617.79332: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8331 1726776617.79360: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8331 1726776617.79416: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8331 1726776617.79437: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8331 1726776617.79459: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8331 1726776617.79478: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8331 1726776617.79562: Evaluated conditional (not __kernel_settings_is_transactional is defined): True 8331 1726776617.79570: variable 'omit' from source: magic vars 8331 1726776617.79606: variable 'omit' from source: magic vars 8331 1726776617.79633: variable 'omit' from source: magic vars 8331 1726776617.79652: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8331 1726776617.79675: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8331 1726776617.79691: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8331 1726776617.79704: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8331 1726776617.79714: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8331 1726776617.79738: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8331 1726776617.79743: variable 'ansible_host' from source: host vars for 'managed_node3' 8331 1726776617.79747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8331 1726776617.79811: Set connection var ansible_module_compression to ZIP_DEFLATED 8331 1726776617.79818: Set connection var 
ansible_shell_type to sh 8331 1726776617.79825: Set connection var ansible_timeout to 10 8331 1726776617.79832: Set connection var ansible_connection to ssh 8331 1726776617.79839: Set connection var ansible_pipelining to False 8331 1726776617.79844: Set connection var ansible_shell_executable to /bin/sh 8331 1726776617.79860: variable 'ansible_shell_executable' from source: unknown 8331 1726776617.79864: variable 'ansible_connection' from source: unknown 8331 1726776617.79868: variable 'ansible_module_compression' from source: unknown 8331 1726776617.79871: variable 'ansible_shell_type' from source: unknown 8331 1726776617.79874: variable 'ansible_shell_executable' from source: unknown 8331 1726776617.79878: variable 'ansible_host' from source: host vars for 'managed_node3' 8331 1726776617.79882: variable 'ansible_pipelining' from source: unknown 8331 1726776617.79885: variable 'ansible_timeout' from source: unknown 8331 1726776617.79889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8331 1726776617.79979: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 8331 1726776617.79989: variable 'omit' from source: magic vars 8331 1726776617.79994: starting attempt loop 8331 1726776617.79998: running the handler 8331 1726776617.80007: _low_level_execute_command(): starting 8331 1726776617.80012: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8331 1726776617.82452: stdout chunk (state=2): >>>/root <<< 8331 1726776617.82463: stderr chunk (state=2): >>><<< 8331 1726776617.82475: stdout chunk (state=3): >>><<< 8331 1726776617.82492: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8331 1726776617.82505: _low_level_execute_command(): starting 8331 1726776617.82511: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776617.8249955-8331-265239974490546 `" && echo ansible-tmp-1726776617.8249955-8331-265239974490546="` echo /root/.ansible/tmp/ansible-tmp-1726776617.8249955-8331-265239974490546 `" ) && sleep 0' 8331 1726776617.85251: stdout chunk (state=2): >>>ansible-tmp-1726776617.8249955-8331-265239974490546=/root/.ansible/tmp/ansible-tmp-1726776617.8249955-8331-265239974490546 <<< 8331 1726776617.85376: stderr chunk (state=3): >>><<< 8331 1726776617.85383: stdout chunk (state=3): >>><<< 8331 1726776617.85399: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776617.8249955-8331-265239974490546=/root/.ansible/tmp/ansible-tmp-1726776617.8249955-8331-265239974490546 , stderr= 8331 1726776617.85436: variable 'ansible_module_compression' from source: unknown 8331 1726776617.85480: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8331 1726776617.85507: variable 'ansible_facts' from source: unknown 8331 1726776617.85580: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776617.8249955-8331-265239974490546/AnsiballZ_stat.py 8331 1726776617.85675: Sending initial data 8331 1726776617.85682: Sent initial data (151 bytes) 8331 1726776617.88067: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmp5023y_w9 /root/.ansible/tmp/ansible-tmp-1726776617.8249955-8331-265239974490546/AnsiballZ_stat.py 
<<< 8331 1726776617.89035: stderr chunk (state=3): >>><<< 8331 1726776617.89042: stdout chunk (state=3): >>><<< 8331 1726776617.89061: done transferring module to remote 8331 1726776617.89070: _low_level_execute_command(): starting 8331 1726776617.89076: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776617.8249955-8331-265239974490546/ /root/.ansible/tmp/ansible-tmp-1726776617.8249955-8331-265239974490546/AnsiballZ_stat.py && sleep 0' 8331 1726776617.91344: stderr chunk (state=2): >>><<< 8331 1726776617.91350: stdout chunk (state=2): >>><<< 8331 1726776617.91364: _low_level_execute_command() done: rc=0, stdout=, stderr= 8331 1726776617.91368: _low_level_execute_command(): starting 8331 1726776617.91373: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776617.8249955-8331-265239974490546/AnsiballZ_stat.py && sleep 0' 8331 1726776618.06149: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/sbin/transactional-update", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8331 1726776618.07172: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 8331 1726776618.07220: stderr chunk (state=3): >>><<< 8331 1726776618.07228: stdout chunk (state=3): >>><<< 8331 1726776618.07245: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/sbin/transactional-update", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.8.186 closed. 
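The stat module reports that /sbin/transactional-update does not exist, so the following set_fact will record false. When debugging a run like this, the simplest way to confirm what a registered result contains is a throwaway debug task; this is a debugging aid only, not part of the role:

    - name: Inspect the registered stat result (debugging aid only)
      debug:
        var: __transactional_update_stat.stat.exists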
8331 1726776618.07273: done with _execute_module (stat, {'path': '/sbin/transactional-update', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776617.8249955-8331-265239974490546/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8331 1726776618.07284: _low_level_execute_command(): starting 8331 1726776618.07289: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776617.8249955-8331-265239974490546/ > /dev/null 2>&1 && sleep 0' 8331 1726776618.09695: stderr chunk (state=2): >>><<< 8331 1726776618.09705: stdout chunk (state=2): >>><<< 8331 1726776618.09721: _low_level_execute_command() done: rc=0, stdout=, stderr= 8331 1726776618.09727: handler run complete 8331 1726776618.09744: attempt loop complete, returning result 8331 1726776618.09748: _execute() done 8331 1726776618.09752: dumping result to json 8331 1726776618.09756: done dumping result, returning 8331 1726776618.09766: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin [120fa90a-8a95-c4e4-06a7-00000000008a] 8331 1726776618.09773: sending task result for task 120fa90a-8a95-c4e4-06a7-00000000008a 8331 1726776618.09802: done sending task result for task 120fa90a-8a95-c4e4-06a7-00000000008a 8331 1726776618.09806: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 8283 1726776618.09934: no more pending results, returning what we have 8283 1726776618.09937: results queue empty 8283 1726776618.09937: checking for any_errors_fatal 8283 1726776618.09941: done checking for any_errors_fatal 8283 1726776618.09942: checking for max_fail_percentage 8283 1726776618.09943: done checking for max_fail_percentage 8283 1726776618.09943: checking to see if all hosts have failed and the running result is not ok 8283 1726776618.09944: done checking to see if all hosts have failed 8283 1726776618.09944: getting the remaining hosts for this loop 8283 1726776618.09945: done getting the remaining hosts for this loop 8283 1726776618.09949: getting the next task for host managed_node3 8283 1726776618.09954: done getting next task for host managed_node3 8283 1726776618.09957: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 8283 1726776618.09960: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776618.09969: getting variables 8283 1726776618.09970: in VariableManager get_vars() 8283 1726776618.10001: Calling all_inventory to load vars for managed_node3 8283 1726776618.10004: Calling groups_inventory to load vars for managed_node3 8283 1726776618.10006: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776618.10013: Calling all_plugins_play to load vars for managed_node3 8283 1726776618.10015: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776618.10018: Calling groups_plugins_play to load vars for managed_node3 8283 1726776618.10062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776618.10091: done with get_vars() 8283 1726776618.10097: done getting variables 8283 1726776618.10139: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:27 Thursday 19 September 2024 16:10:18 -0400 (0:00:00.318) 0:00:01.806 **** 8283 1726776618.10164: entering _queue_task() for managed_node3/set_fact 8283 1726776618.10350: worker is 1 (out of 1 available) 8283 1726776618.10363: exiting _queue_task() for managed_node3/set_fact 8283 1726776618.10374: done queuing things up, now waiting for results queue to drain 8283 1726776618.10375: waiting for pending results... 
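The queued set_fact turns the registered result into the transactional flag. Based on the variables referenced below and the reported outcome (__kernel_settings_is_transactional: false), it likely looks roughly like:

    - name: Set flag if transactional-update exists
      set_fact:
        __kernel_settings_is_transactional: "{{ __transactional_update_stat.stat.exists }}"
      when: not __kernel_settings_is_transactional is defined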
8346 1726776618.10477: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 8346 1726776618.10590: in run() - task 120fa90a-8a95-c4e4-06a7-00000000008b 8346 1726776618.10605: variable 'ansible_search_path' from source: unknown 8346 1726776618.10609: variable 'ansible_search_path' from source: unknown 8346 1726776618.10636: calling self._execute() 8346 1726776618.10683: variable 'ansible_host' from source: host vars for 'managed_node3' 8346 1726776618.10691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8346 1726776618.10699: variable 'omit' from source: magic vars 8346 1726776618.11011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8346 1726776618.11183: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8346 1726776618.11215: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8346 1726776618.11241: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8346 1726776618.11269: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8346 1726776618.11326: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8346 1726776618.11347: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8346 1726776618.11369: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8346 1726776618.11388: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8346 1726776618.11476: Evaluated conditional (not __kernel_settings_is_transactional is defined): True 8346 1726776618.11483: variable 'omit' from source: magic vars 8346 1726776618.11522: variable 'omit' from source: magic vars 8346 1726776618.11600: variable '__transactional_update_stat' from source: set_fact 8346 1726776618.11638: variable 'omit' from source: magic vars 8346 1726776618.11660: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8346 1726776618.11680: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8346 1726776618.11694: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8346 1726776618.11705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8346 1726776618.11712: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8346 1726776618.11737: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8346 1726776618.11741: variable 'ansible_host' from source: host vars for 'managed_node3' 8346 1726776618.11743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8346 1726776618.11804: Set connection var 
ansible_module_compression to ZIP_DEFLATED 8346 1726776618.11810: Set connection var ansible_shell_type to sh 8346 1726776618.11813: Set connection var ansible_timeout to 10 8346 1726776618.11817: Set connection var ansible_connection to ssh 8346 1726776618.11821: Set connection var ansible_pipelining to False 8346 1726776618.11825: Set connection var ansible_shell_executable to /bin/sh 8346 1726776618.11840: variable 'ansible_shell_executable' from source: unknown 8346 1726776618.11843: variable 'ansible_connection' from source: unknown 8346 1726776618.11845: variable 'ansible_module_compression' from source: unknown 8346 1726776618.11847: variable 'ansible_shell_type' from source: unknown 8346 1726776618.11849: variable 'ansible_shell_executable' from source: unknown 8346 1726776618.11850: variable 'ansible_host' from source: host vars for 'managed_node3' 8346 1726776618.11852: variable 'ansible_pipelining' from source: unknown 8346 1726776618.11854: variable 'ansible_timeout' from source: unknown 8346 1726776618.11856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8346 1726776618.11912: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8346 1726776618.11920: variable 'omit' from source: magic vars 8346 1726776618.11924: starting attempt loop 8346 1726776618.11926: running the handler 8346 1726776618.11934: handler run complete 8346 1726776618.11940: attempt loop complete, returning result 8346 1726776618.11942: _execute() done 8346 1726776618.11945: dumping result to json 8346 1726776618.11947: done dumping result, returning 8346 1726776618.11951: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists [120fa90a-8a95-c4e4-06a7-00000000008b] 8346 1726776618.11955: sending task result for task 120fa90a-8a95-c4e4-06a7-00000000008b 8346 1726776618.11972: done sending task result for task 120fa90a-8a95-c4e4-06a7-00000000008b 8346 1726776618.11974: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_is_transactional": false }, "changed": false } 8283 1726776618.12172: no more pending results, returning what we have 8283 1726776618.12173: results queue empty 8283 1726776618.12174: checking for any_errors_fatal 8283 1726776618.12177: done checking for any_errors_fatal 8283 1726776618.12178: checking for max_fail_percentage 8283 1726776618.12178: done checking for max_fail_percentage 8283 1726776618.12179: checking to see if all hosts have failed and the running result is not ok 8283 1726776618.12179: done checking to see if all hosts have failed 8283 1726776618.12179: getting the remaining hosts for this loop 8283 1726776618.12180: done getting the remaining hosts for this loop 8283 1726776618.12182: getting the next task for host managed_node3 8283 1726776618.12187: done getting next task for host managed_node3 8283 1726776618.12189: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 8283 1726776618.12191: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776618.12197: getting variables 8283 1726776618.12201: in VariableManager get_vars() 8283 1726776618.12224: Calling all_inventory to load vars for managed_node3 8283 1726776618.12226: Calling groups_inventory to load vars for managed_node3 8283 1726776618.12227: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776618.12235: Calling all_plugins_play to load vars for managed_node3 8283 1726776618.12236: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776618.12238: Calling groups_plugins_play to load vars for managed_node3 8283 1726776618.12271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776618.12296: done with get_vars() 8283 1726776618.12301: done getting variables 8283 1726776618.12377: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set platform/version specific variables] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:31 Thursday 19 September 2024 16:10:18 -0400 (0:00:00.022) 0:00:01.828 **** 8283 1726776618.12399: entering _queue_task() for managed_node3/include_vars 8283 1726776618.12400: Creating lock for include_vars 8283 1726776618.12543: worker is 1 (out of 1 available) 8283 1726776618.12557: exiting _queue_task() for managed_node3/include_vars 8283 1726776618.12570: done queuing things up, now waiting for results queue to drain 8283 1726776618.12571: waiting for pending results... 
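This include_vars task resolves platform-specific variables through the first_found lookup with a task-level ffparams dictionary, as the plugin loads in the log below show. The log only confirms that vars/default.yml ended up being included; the candidate file names in the sketch are an assumption based on the common linux-system-roles pattern, not copied from the collection:

    - name: Set platform/version specific variables
      include_vars: "{{ lookup('first_found', ffparams) }}"
      vars:
        ffparams:
          files:
            # candidate names assumed; only default.yml is confirmed by this log
            - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
            - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
            - "{{ ansible_facts['distribution'] }}.yml"
            - "{{ ansible_facts['os_family'] }}.yml"
            - default.yml
          paths:
            - "{{ role_path }}/vars"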
8347 1726776618.12663: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 8347 1726776618.12766: in run() - task 120fa90a-8a95-c4e4-06a7-00000000008d 8347 1726776618.12780: variable 'ansible_search_path' from source: unknown 8347 1726776618.12783: variable 'ansible_search_path' from source: unknown 8347 1726776618.12809: calling self._execute() 8347 1726776618.12854: variable 'ansible_host' from source: host vars for 'managed_node3' 8347 1726776618.12861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8347 1726776618.12869: variable 'omit' from source: magic vars 8347 1726776618.12935: variable 'omit' from source: magic vars 8347 1726776618.12977: variable 'omit' from source: magic vars 8347 1726776618.13215: variable 'ffparams' from source: task vars 8347 1726776618.13306: variable 'ansible_facts' from source: unknown 8347 1726776618.13411: variable 'ansible_facts' from source: unknown 8347 1726776618.13474: variable 'ansible_facts' from source: unknown 8347 1726776618.13536: variable 'ansible_facts' from source: unknown 8347 1726776618.13586: variable 'role_path' from source: magic vars 8347 1726776618.13696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 8347 1726776618.13852: Loaded config def from plugin (lookup/first_found) 8347 1726776618.13860: Loading LookupModule 'first_found' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/first_found.py 8347 1726776618.13887: variable 'ansible_search_path' from source: unknown 8347 1726776618.13904: variable 'ansible_search_path' from source: unknown 8347 1726776618.13912: variable 'ansible_search_path' from source: unknown 8347 1726776618.13919: variable 'ansible_search_path' from source: unknown 8347 1726776618.13925: variable 'ansible_search_path' from source: unknown 8347 1726776618.13943: variable 'omit' from source: magic vars 8347 1726776618.13962: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8347 1726776618.13979: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8347 1726776618.13995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8347 1726776618.14008: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8347 1726776618.14017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8347 1726776618.14038: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8347 1726776618.14043: variable 'ansible_host' from source: host vars for 'managed_node3' 8347 1726776618.14046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8347 1726776618.14103: Set connection var ansible_module_compression to ZIP_DEFLATED 8347 1726776618.14109: Set connection var ansible_shell_type to sh 8347 1726776618.14112: Set connection var ansible_timeout to 10 8347 1726776618.14116: Set connection var ansible_connection to ssh 8347 1726776618.14121: Set connection var ansible_pipelining to False 8347 1726776618.14123: Set connection var ansible_shell_executable to /bin/sh 8347 1726776618.14141: variable 'ansible_shell_executable' from source: unknown 8347 1726776618.14145: variable 'ansible_connection' from source: unknown 8347 1726776618.14148: variable 
'ansible_module_compression' from source: unknown 8347 1726776618.14151: variable 'ansible_shell_type' from source: unknown 8347 1726776618.14155: variable 'ansible_shell_executable' from source: unknown 8347 1726776618.14157: variable 'ansible_host' from source: host vars for 'managed_node3' 8347 1726776618.14160: variable 'ansible_pipelining' from source: unknown 8347 1726776618.14163: variable 'ansible_timeout' from source: unknown 8347 1726776618.14166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8347 1726776618.14224: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8347 1726776618.14244: variable 'omit' from source: magic vars 8347 1726776618.14250: starting attempt loop 8347 1726776618.14254: running the handler 8347 1726776618.14292: handler run complete 8347 1726776618.14302: attempt loop complete, returning result 8347 1726776618.14306: _execute() done 8347 1726776618.14309: dumping result to json 8347 1726776618.14311: done dumping result, returning 8347 1726776618.14315: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables [120fa90a-8a95-c4e4-06a7-00000000008d] 8347 1726776618.14320: sending task result for task 120fa90a-8a95-c4e4-06a7-00000000008d 8347 1726776618.14345: done sending task result for task 120fa90a-8a95-c4e4-06a7-00000000008d 8347 1726776618.14348: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_packages": [ "tuned", "python3-configobj" ], "__kernel_settings_services": [ "tuned" ] }, "ansible_included_var_files": [ "/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/default.yml" ], "changed": false } 8283 1726776618.14475: no more pending results, returning what we have 8283 1726776618.14477: results queue empty 8283 1726776618.14478: checking for any_errors_fatal 8283 1726776618.14482: done checking for any_errors_fatal 8283 1726776618.14483: checking for max_fail_percentage 8283 1726776618.14484: done checking for max_fail_percentage 8283 1726776618.14484: checking to see if all hosts have failed and the running result is not ok 8283 1726776618.14485: done checking to see if all hosts have failed 8283 1726776618.14485: getting the remaining hosts for this loop 8283 1726776618.14486: done getting the remaining hosts for this loop 8283 1726776618.14489: getting the next task for host managed_node3 8283 1726776618.14495: done getting next task for host managed_node3 8283 1726776618.14498: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8283 1726776618.14500: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8283 1726776618.14507: getting variables 8283 1726776618.14508: in VariableManager get_vars() 8283 1726776618.14532: Calling all_inventory to load vars for managed_node3 8283 1726776618.14535: Calling groups_inventory to load vars for managed_node3 8283 1726776618.14536: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776618.14542: Calling all_plugins_play to load vars for managed_node3 8283 1726776618.14543: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776618.14545: Calling groups_plugins_play to load vars for managed_node3 8283 1726776618.14578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776618.14605: done with get_vars() 8283 1726776618.14610: done getting variables 8283 1726776618.14676: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required packages are installed] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 Thursday 19 September 2024 16:10:18 -0400 (0:00:00.022) 0:00:01.851 **** 8283 1726776618.14696: entering _queue_task() for managed_node3/package 8283 1726776618.14697: Creating lock for package 8283 1726776618.14839: worker is 1 (out of 1 available) 8283 1726776618.14853: exiting _queue_task() for managed_node3/package 8283 1726776618.14866: done queuing things up, now waiting for results queue to drain 8283 1726776618.14868: waiting for pending results... 
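The packages recorded by that include_vars result (__kernel_settings_packages: tuned and python3-configobj) are installed by the task just queued. Based on the module arguments that appear in the dnf result further below, the task at tasks/main.yml:12 amounts to roughly the following sketch; keywords other than the visible module arguments are assumptions:

- name: Ensure required packages are installed
  package:
    name: "{{ __kernel_settings_packages }}"   # resolves to [tuned, python3-configobj] here
    state: present
  # The worker trace also consults __kernel_settings_is_ostree (set earlier via
  # set_fact), presumably to adjust behaviour on OSTree-based systems; how that
  # fact is used is not visible in this log.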
8348 1726776618.14958: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8348 1726776618.15049: in run() - task 120fa90a-8a95-c4e4-06a7-000000000025 8348 1726776618.15063: variable 'ansible_search_path' from source: unknown 8348 1726776618.15067: variable 'ansible_search_path' from source: unknown 8348 1726776618.15091: calling self._execute() 8348 1726776618.15131: variable 'ansible_host' from source: host vars for 'managed_node3' 8348 1726776618.15140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8348 1726776618.15148: variable 'omit' from source: magic vars 8348 1726776618.15212: variable 'omit' from source: magic vars 8348 1726776618.15245: variable 'omit' from source: magic vars 8348 1726776618.15267: variable '__kernel_settings_packages' from source: include_vars 8348 1726776618.15458: variable '__kernel_settings_packages' from source: include_vars 8348 1726776618.15603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8348 1726776618.17070: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8348 1726776618.17123: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8348 1726776618.17155: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8348 1726776618.17182: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8348 1726776618.17204: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8348 1726776618.17269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8348 1726776618.17289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8348 1726776618.17309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8348 1726776618.17339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8348 1726776618.17350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8348 1726776618.17422: variable '__kernel_settings_is_ostree' from source: set_fact 8348 1726776618.17430: variable 'omit' from source: magic vars 8348 1726776618.17450: variable 'omit' from source: magic vars 8348 1726776618.17472: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8348 1726776618.17491: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8348 1726776618.17505: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8348 1726776618.17518: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8348 1726776618.17530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8348 1726776618.17551: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8348 1726776618.17556: variable 'ansible_host' from source: host vars for 'managed_node3' 8348 1726776618.17563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8348 1726776618.17622: Set connection var ansible_module_compression to ZIP_DEFLATED 8348 1726776618.17634: Set connection var ansible_shell_type to sh 8348 1726776618.17641: Set connection var ansible_timeout to 10 8348 1726776618.17644: Set connection var ansible_connection to ssh 8348 1726776618.17649: Set connection var ansible_pipelining to False 8348 1726776618.17651: Set connection var ansible_shell_executable to /bin/sh 8348 1726776618.17666: variable 'ansible_shell_executable' from source: unknown 8348 1726776618.17669: variable 'ansible_connection' from source: unknown 8348 1726776618.17671: variable 'ansible_module_compression' from source: unknown 8348 1726776618.17673: variable 'ansible_shell_type' from source: unknown 8348 1726776618.17675: variable 'ansible_shell_executable' from source: unknown 8348 1726776618.17676: variable 'ansible_host' from source: host vars for 'managed_node3' 8348 1726776618.17678: variable 'ansible_pipelining' from source: unknown 8348 1726776618.17680: variable 'ansible_timeout' from source: unknown 8348 1726776618.17682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8348 1726776618.17738: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8348 1726776618.17747: variable 'omit' from source: magic vars 8348 1726776618.17751: starting attempt loop 8348 1726776618.17754: running the handler 8348 1726776618.17807: variable 'ansible_facts' from source: unknown 8348 1726776618.17832: _low_level_execute_command(): starting 8348 1726776618.17838: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8348 1726776618.20106: stdout chunk (state=2): >>>/root <<< 8348 1726776618.20232: stderr chunk (state=3): >>><<< 8348 1726776618.20239: stdout chunk (state=3): >>><<< 8348 1726776618.20254: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8348 1726776618.20265: _low_level_execute_command(): starting 8348 1726776618.20270: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776618.2026074-8348-142989447568705 `" && echo ansible-tmp-1726776618.2026074-8348-142989447568705="` echo /root/.ansible/tmp/ansible-tmp-1726776618.2026074-8348-142989447568705 `" ) && sleep 0' 8348 1726776618.22653: stdout chunk (state=2): >>>ansible-tmp-1726776618.2026074-8348-142989447568705=/root/.ansible/tmp/ansible-tmp-1726776618.2026074-8348-142989447568705 <<< 8348 1726776618.22778: stderr chunk (state=3): >>><<< 8348 1726776618.22784: stdout chunk (state=3): >>><<< 8348 1726776618.22796: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726776618.2026074-8348-142989447568705=/root/.ansible/tmp/ansible-tmp-1726776618.2026074-8348-142989447568705 , stderr= 8348 1726776618.22815: variable 'ansible_module_compression' from source: unknown 8348 1726776618.22848: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 8348 1726776618.22894: variable 'ansible_facts' from source: unknown 8348 1726776618.23044: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776618.2026074-8348-142989447568705/AnsiballZ_setup.py 8348 1726776618.23141: Sending initial data 8348 1726776618.23148: Sent initial data (152 bytes) 8348 1726776618.25556: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpt7l2hxcy /root/.ansible/tmp/ansible-tmp-1726776618.2026074-8348-142989447568705/AnsiballZ_setup.py <<< 8348 1726776618.27355: stderr chunk (state=3): >>><<< 8348 1726776618.27364: stdout chunk (state=3): >>><<< 8348 1726776618.27384: done transferring module to remote 8348 1726776618.27394: _low_level_execute_command(): starting 8348 1726776618.27399: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776618.2026074-8348-142989447568705/ /root/.ansible/tmp/ansible-tmp-1726776618.2026074-8348-142989447568705/AnsiballZ_setup.py && sleep 0' 8348 1726776618.29725: stderr chunk (state=2): >>><<< 8348 1726776618.29734: stdout chunk (state=2): >>><<< 8348 1726776618.29747: _low_level_execute_command() done: rc=0, stdout=, stderr= 8348 1726776618.29751: _low_level_execute_command(): starting 8348 1726776618.29756: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776618.2026074-8348-142989447568705/AnsiballZ_setup.py && sleep 0' 8348 1726776618.74758: stdout chunk (state=2): >>> {"ansible_facts": {"ansible_pkg_mgr": "dnf"}, "invocation": {"module_args": {"filter": ["ansible_pkg_mgr"], "gather_subset": ["!all"], "gather_timeout": 10, "fact_path": "/etc/ansible/facts.d"}}} <<< 8348 1726776618.76461: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 8348 1726776618.76507: stderr chunk (state=3): >>><<< 8348 1726776618.76516: stdout chunk (state=3): >>><<< 8348 1726776618.76535: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_pkg_mgr": "dnf"}, "invocation": {"module_args": {"filter": ["ansible_pkg_mgr"], "gather_subset": ["!all"], "gather_timeout": 10, "fact_path": "/etc/ansible/facts.d"}}} , stderr=Shared connection to 10.31.8.186 closed. 
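Before delegating to a concrete backend, the generic package action runs a minimal fact-gathering pass to discover the package manager; the result above ({"ansible_pkg_mgr": "dnf"}) is what makes it build ansible.legacy.dnf next. Each module run also follows the same low-level sequence visible here: create a remote temp directory, sftp the AnsiballZ payload, chmod it, execute it with the platform python, then remove the temp directory. The equivalent standalone fact gather, matching the module_args shown in the result, would be:

- name: Discover only the package manager fact
  ansible.builtin.setup:
    filter: ansible_pkg_mgr
    gather_subset: "!all"
    gather_timeout: 10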
8348 1726776618.76562: done with _execute_module (ansible.legacy.setup, {'filter': 'ansible_pkg_mgr', 'gather_subset': '!all', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776618.2026074-8348-142989447568705/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8348 1726776618.76576: Facts {'ansible_facts': {'ansible_pkg_mgr': 'dnf'}, 'invocation': {'module_args': {'filter': ['ansible_pkg_mgr'], 'gather_subset': ['!all'], 'gather_timeout': 10, 'fact_path': '/etc/ansible/facts.d'}}, '_ansible_parsed': True} 8348 1726776618.76630: variable 'ansible_module_compression' from source: unknown 8348 1726776618.76670: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 8348 1726776618.76675: ANSIBALLZ: Acquiring lock 8348 1726776618.76679: ANSIBALLZ: Lock acquired: 140690877500448 8348 1726776618.76682: ANSIBALLZ: Creating module 8348 1726776618.89064: ANSIBALLZ: Writing module into payload 8348 1726776618.89263: ANSIBALLZ: Writing module 8348 1726776618.89283: ANSIBALLZ: Renaming module 8348 1726776618.89290: ANSIBALLZ: Done creating module 8348 1726776618.89303: variable 'ansible_facts' from source: unknown 8348 1726776618.89375: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776618.2026074-8348-142989447568705/AnsiballZ_dnf.py 8348 1726776618.89464: Sending initial data 8348 1726776618.89471: Sent initial data (150 bytes) 8348 1726776618.91993: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmprm15vcix /root/.ansible/tmp/ansible-tmp-1726776618.2026074-8348-142989447568705/AnsiballZ_dnf.py <<< 8348 1726776618.93226: stderr chunk (state=3): >>><<< 8348 1726776618.93234: stdout chunk (state=3): >>><<< 8348 1726776618.93250: done transferring module to remote 8348 1726776618.93259: _low_level_execute_command(): starting 8348 1726776618.93266: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776618.2026074-8348-142989447568705/ /root/.ansible/tmp/ansible-tmp-1726776618.2026074-8348-142989447568705/AnsiballZ_dnf.py && sleep 0' 8348 1726776618.95607: stderr chunk (state=2): >>><<< 8348 1726776618.95614: stdout chunk (state=2): >>><<< 8348 1726776618.95627: _low_level_execute_command() done: rc=0, stdout=, stderr= 8348 1726776618.95633: _low_level_execute_command(): starting 8348 1726776618.95639: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776618.2026074-8348-142989447568705/AnsiballZ_dnf.py && sleep 0' 8348 1726776624.42135: stdout chunk (state=2): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, 
"update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 8348 1726776624.50254: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 8348 1726776624.50306: stderr chunk (state=3): >>><<< 8348 1726776624.50314: stdout chunk (state=3): >>><<< 8348 1726776624.50332: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.8.186 closed. 8348 1726776624.50370: done with _execute_module (ansible.legacy.dnf, {'name': ['tuned', 'python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776618.2026074-8348-142989447568705/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8348 1726776624.50379: _low_level_execute_command(): starting 8348 1726776624.50385: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776618.2026074-8348-142989447568705/ > /dev/null 2>&1 && sleep 0' 8348 1726776624.53046: stderr chunk (state=2): >>><<< 8348 1726776624.53054: stdout chunk (state=2): >>><<< 8348 1726776624.53066: _low_level_execute_command() done: rc=0, stdout=, stderr= 8348 1726776624.53077: handler run complete 8348 1726776624.53100: attempt loop complete, returning result 8348 1726776624.53103: _execute() done 8348 1726776624.53105: dumping result to json 8348 1726776624.53110: done dumping result, returning 8348 1726776624.53115: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed [120fa90a-8a95-c4e4-06a7-000000000025] 8348 1726776624.53119: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000025 8348 1726776624.53160: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000025 8348 1726776624.53164: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8283 1726776624.53318: no more pending results, returning what we have 8283 1726776624.53320: results queue empty 8283 1726776624.53321: checking for any_errors_fatal 8283 1726776624.53328: done checking for any_errors_fatal 8283 
1726776624.53330: checking for max_fail_percentage 8283 1726776624.53331: done checking for max_fail_percentage 8283 1726776624.53332: checking to see if all hosts have failed and the running result is not ok 8283 1726776624.53332: done checking to see if all hosts have failed 8283 1726776624.53333: getting the remaining hosts for this loop 8283 1726776624.53334: done getting the remaining hosts for this loop 8283 1726776624.53337: getting the next task for host managed_node3 8283 1726776624.53346: done getting next task for host managed_node3 8283 1726776624.53349: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8283 1726776624.53351: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776624.53359: getting variables 8283 1726776624.53361: in VariableManager get_vars() 8283 1726776624.53390: Calling all_inventory to load vars for managed_node3 8283 1726776624.53392: Calling groups_inventory to load vars for managed_node3 8283 1726776624.53394: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776624.53402: Calling all_plugins_play to load vars for managed_node3 8283 1726776624.53404: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776624.53406: Calling groups_plugins_play to load vars for managed_node3 8283 1726776624.53453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776624.53489: done with get_vars() 8283 1726776624.53498: done getting variables 8283 1726776624.53569: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:24 Thursday 19 September 2024 16:10:24 -0400 (0:00:06.388) 0:00:08.240 **** 8283 1726776624.53592: entering _queue_task() for managed_node3/debug 8283 1726776624.53593: Creating lock for debug 8283 1726776624.53748: worker is 1 (out of 1 available) 8283 1726776624.53761: exiting _queue_task() for managed_node3/debug 8283 1726776624.53771: done queuing things up, now waiting for results queue to drain 8283 1726776624.53773: waiting for pending results... 
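This debug task and the two that follow (Reboot transactional update systems, Fail if reboot is needed and not set) are all guarded by the same transactional-update flag, which is why each is skipped below with false_condition '__kernel_settings_is_transactional | d(false)'. In outline, with message wording and any additional conditions as assumptions (only the guard expression is taken from the log):

- name: Notify user that reboot is needed to apply changes
  debug:
    msg: A reboot is required to apply the changes   # illustrative wording
  when: __kernel_settings_is_transactional | d(false)

- name: Reboot transactional update systems
  reboot:
  when: __kernel_settings_is_transactional | d(false)

- name: Fail if reboot is needed and not set
  fail:
    msg: A reboot is required but was not allowed   # illustrative wording
  when: __kernel_settings_is_transactional | d(false)   # the real task likely checks further flags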
8502 1726776624.53902: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8502 1726776624.54032: in run() - task 120fa90a-8a95-c4e4-06a7-000000000027 8502 1726776624.54050: variable 'ansible_search_path' from source: unknown 8502 1726776624.54055: variable 'ansible_search_path' from source: unknown 8502 1726776624.54088: calling self._execute() 8502 1726776624.54147: variable 'ansible_host' from source: host vars for 'managed_node3' 8502 1726776624.54157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8502 1726776624.54165: variable 'omit' from source: magic vars 8502 1726776624.54617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8502 1726776624.56860: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8502 1726776624.56938: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8502 1726776624.56973: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8502 1726776624.57006: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8502 1726776624.57031: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8502 1726776624.57098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8502 1726776624.57126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8502 1726776624.57180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8502 1726776624.57220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8502 1726776624.57236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8502 1726776624.57341: variable '__kernel_settings_is_transactional' from source: set_fact 8502 1726776624.57358: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 8502 1726776624.57362: when evaluation is False, skipping this task 8502 1726776624.57365: _execute() done 8502 1726776624.57368: dumping result to json 8502 1726776624.57372: done dumping result, returning 8502 1726776624.57379: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes [120fa90a-8a95-c4e4-06a7-000000000027] 8502 1726776624.57384: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000027 8502 1726776624.57411: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000027 8502 1726776624.57414: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "__kernel_settings_is_transactional | d(false)" } 8283 
1726776624.57742: no more pending results, returning what we have 8283 1726776624.57745: results queue empty 8283 1726776624.57745: checking for any_errors_fatal 8283 1726776624.57751: done checking for any_errors_fatal 8283 1726776624.57752: checking for max_fail_percentage 8283 1726776624.57753: done checking for max_fail_percentage 8283 1726776624.57753: checking to see if all hosts have failed and the running result is not ok 8283 1726776624.57754: done checking to see if all hosts have failed 8283 1726776624.57755: getting the remaining hosts for this loop 8283 1726776624.57756: done getting the remaining hosts for this loop 8283 1726776624.57759: getting the next task for host managed_node3 8283 1726776624.57765: done getting next task for host managed_node3 8283 1726776624.57768: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8283 1726776624.57770: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776624.57782: getting variables 8283 1726776624.57784: in VariableManager get_vars() 8283 1726776624.57813: Calling all_inventory to load vars for managed_node3 8283 1726776624.57816: Calling groups_inventory to load vars for managed_node3 8283 1726776624.57818: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776624.57826: Calling all_plugins_play to load vars for managed_node3 8283 1726776624.57831: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776624.57834: Calling groups_plugins_play to load vars for managed_node3 8283 1726776624.57882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776624.57921: done with get_vars() 8283 1726776624.57931: done getting variables 8283 1726776624.58057: Loading ActionModule 'reboot' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Reboot transactional update systems] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:29 Thursday 19 September 2024 16:10:24 -0400 (0:00:00.044) 0:00:08.285 **** 8283 1726776624.58087: entering _queue_task() for managed_node3/reboot 8283 1726776624.58089: Creating lock for reboot 8283 1726776624.58321: worker is 1 (out of 1 available) 8283 1726776624.58336: exiting _queue_task() for managed_node3/reboot 8283 1726776624.58347: done queuing things up, now waiting for results queue to drain 8283 1726776624.58349: waiting for pending results... 
8505 1726776624.58537: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8505 1726776624.58656: in run() - task 120fa90a-8a95-c4e4-06a7-000000000028 8505 1726776624.58673: variable 'ansible_search_path' from source: unknown 8505 1726776624.58677: variable 'ansible_search_path' from source: unknown 8505 1726776624.58707: calling self._execute() 8505 1726776624.58764: variable 'ansible_host' from source: host vars for 'managed_node3' 8505 1726776624.58773: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8505 1726776624.58781: variable 'omit' from source: magic vars 8505 1726776624.59211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8505 1726776624.61415: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8505 1726776624.61482: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8505 1726776624.61520: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8505 1726776624.61554: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8505 1726776624.61579: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8505 1726776624.61672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8505 1726776624.61702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8505 1726776624.61727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8505 1726776624.61770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8505 1726776624.61785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8505 1726776624.61888: variable '__kernel_settings_is_transactional' from source: set_fact 8505 1726776624.61906: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 8505 1726776624.61910: when evaluation is False, skipping this task 8505 1726776624.61914: _execute() done 8505 1726776624.61917: dumping result to json 8505 1726776624.61921: done dumping result, returning 8505 1726776624.61927: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems [120fa90a-8a95-c4e4-06a7-000000000028] 8505 1726776624.61936: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000028 8505 1726776624.61964: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000028 8505 1726776624.61967: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", "skip_reason": "Conditional 
result was False" } 8283 1726776624.62326: no more pending results, returning what we have 8283 1726776624.62331: results queue empty 8283 1726776624.62332: checking for any_errors_fatal 8283 1726776624.62335: done checking for any_errors_fatal 8283 1726776624.62336: checking for max_fail_percentage 8283 1726776624.62337: done checking for max_fail_percentage 8283 1726776624.62338: checking to see if all hosts have failed and the running result is not ok 8283 1726776624.62339: done checking to see if all hosts have failed 8283 1726776624.62339: getting the remaining hosts for this loop 8283 1726776624.62340: done getting the remaining hosts for this loop 8283 1726776624.62343: getting the next task for host managed_node3 8283 1726776624.62349: done getting next task for host managed_node3 8283 1726776624.62352: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8283 1726776624.62355: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776624.62367: getting variables 8283 1726776624.62368: in VariableManager get_vars() 8283 1726776624.62399: Calling all_inventory to load vars for managed_node3 8283 1726776624.62401: Calling groups_inventory to load vars for managed_node3 8283 1726776624.62404: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776624.62412: Calling all_plugins_play to load vars for managed_node3 8283 1726776624.62414: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776624.62417: Calling groups_plugins_play to load vars for managed_node3 8283 1726776624.62468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776624.62508: done with get_vars() 8283 1726776624.62515: done getting variables 8283 1726776624.62570: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:34 Thursday 19 September 2024 16:10:24 -0400 (0:00:00.045) 0:00:08.330 **** 8283 1726776624.62600: entering _queue_task() for managed_node3/fail 8283 1726776624.62788: worker is 1 (out of 1 available) 8283 1726776624.62800: exiting _queue_task() for managed_node3/fail 8283 1726776624.62810: done queuing things up, now waiting for results queue to drain 8283 1726776624.62811: waiting for pending results... 
8506 1726776624.63004: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8506 1726776624.63123: in run() - task 120fa90a-8a95-c4e4-06a7-000000000029 8506 1726776624.63142: variable 'ansible_search_path' from source: unknown 8506 1726776624.63146: variable 'ansible_search_path' from source: unknown 8506 1726776624.63176: calling self._execute() 8506 1726776624.63232: variable 'ansible_host' from source: host vars for 'managed_node3' 8506 1726776624.63240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8506 1726776624.63249: variable 'omit' from source: magic vars 8506 1726776624.63668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8506 1726776624.65832: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8506 1726776624.65894: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8506 1726776624.65931: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8506 1726776624.65965: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8506 1726776624.66002: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8506 1726776624.66085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8506 1726776624.66109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8506 1726776624.66134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8506 1726776624.66169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8506 1726776624.66180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8506 1726776624.66271: variable '__kernel_settings_is_transactional' from source: set_fact 8506 1726776624.66288: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 8506 1726776624.66291: when evaluation is False, skipping this task 8506 1726776624.66294: _execute() done 8506 1726776624.66297: dumping result to json 8506 1726776624.66300: done dumping result, returning 8506 1726776624.66305: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set [120fa90a-8a95-c4e4-06a7-000000000029] 8506 1726776624.66311: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000029 8506 1726776624.66336: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000029 8506 1726776624.66339: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", "skip_reason": "Conditional 
result was False" } 8283 1726776624.66648: no more pending results, returning what we have 8283 1726776624.66650: results queue empty 8283 1726776624.66651: checking for any_errors_fatal 8283 1726776624.66655: done checking for any_errors_fatal 8283 1726776624.66655: checking for max_fail_percentage 8283 1726776624.66657: done checking for max_fail_percentage 8283 1726776624.66657: checking to see if all hosts have failed and the running result is not ok 8283 1726776624.66658: done checking to see if all hosts have failed 8283 1726776624.66658: getting the remaining hosts for this loop 8283 1726776624.66659: done getting the remaining hosts for this loop 8283 1726776624.66662: getting the next task for host managed_node3 8283 1726776624.66668: done getting next task for host managed_node3 8283 1726776624.66672: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8283 1726776624.66674: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776624.66684: getting variables 8283 1726776624.66685: in VariableManager get_vars() 8283 1726776624.66712: Calling all_inventory to load vars for managed_node3 8283 1726776624.66714: Calling groups_inventory to load vars for managed_node3 8283 1726776624.66716: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776624.66722: Calling all_plugins_play to load vars for managed_node3 8283 1726776624.66725: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776624.66727: Calling groups_plugins_play to load vars for managed_node3 8283 1726776624.66773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776624.66809: done with get_vars() 8283 1726776624.66816: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Read tuned main config] ****** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42 Thursday 19 September 2024 16:10:24 -0400 (0:00:00.042) 0:00:08.373 **** 8283 1726776624.66887: entering _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config 8283 1726776624.66888: Creating lock for fedora.linux_system_roles.kernel_settings_get_config 8283 1726776624.67079: worker is 1 (out of 1 available) 8283 1726776624.67092: exiting _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config 8283 1726776624.67103: done queuing things up, now waiting for results queue to drain 8283 1726776624.67104: waiting for pending results... 
8508 1726776624.67314: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8508 1726776624.67432: in run() - task 120fa90a-8a95-c4e4-06a7-00000000002b 8508 1726776624.67450: variable 'ansible_search_path' from source: unknown 8508 1726776624.67455: variable 'ansible_search_path' from source: unknown 8508 1726776624.67486: calling self._execute() 8508 1726776624.67542: variable 'ansible_host' from source: host vars for 'managed_node3' 8508 1726776624.67551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8508 1726776624.67559: variable 'omit' from source: magic vars 8508 1726776624.67651: variable 'omit' from source: magic vars 8508 1726776624.67695: variable 'omit' from source: magic vars 8508 1726776624.67721: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 8508 1726776624.67983: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 8508 1726776624.68062: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8508 1726776624.68097: variable 'omit' from source: magic vars 8508 1726776624.68139: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8508 1726776624.68171: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8508 1726776624.68191: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8508 1726776624.68208: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8508 1726776624.68221: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8508 1726776624.68306: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8508 1726776624.68312: variable 'ansible_host' from source: host vars for 'managed_node3' 8508 1726776624.68316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8508 1726776624.68414: Set connection var ansible_module_compression to ZIP_DEFLATED 8508 1726776624.68422: Set connection var ansible_shell_type to sh 8508 1726776624.68431: Set connection var ansible_timeout to 10 8508 1726776624.68437: Set connection var ansible_connection to ssh 8508 1726776624.68444: Set connection var ansible_pipelining to False 8508 1726776624.68450: Set connection var ansible_shell_executable to /bin/sh 8508 1726776624.68468: variable 'ansible_shell_executable' from source: unknown 8508 1726776624.68473: variable 'ansible_connection' from source: unknown 8508 1726776624.68477: variable 'ansible_module_compression' from source: unknown 8508 1726776624.68480: variable 'ansible_shell_type' from source: unknown 8508 1726776624.68483: variable 'ansible_shell_executable' from source: unknown 8508 1726776624.68485: variable 'ansible_host' from source: host vars for 'managed_node3' 8508 1726776624.68489: variable 'ansible_pipelining' from source: unknown 8508 1726776624.68492: variable 'ansible_timeout' from source: unknown 8508 1726776624.68495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8508 1726776624.68656: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 8508 1726776624.68666: variable 'omit' from source: magic vars 8508 1726776624.68672: starting attempt loop 8508 1726776624.68675: running the handler 8508 1726776624.68686: _low_level_execute_command(): starting 8508 1726776624.68693: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8508 1726776624.71197: stdout chunk (state=2): >>>/root <<< 8508 1726776624.71332: stderr chunk (state=3): >>><<< 8508 1726776624.71339: stdout chunk (state=3): >>><<< 8508 1726776624.71357: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8508 1726776624.71371: _low_level_execute_command(): starting 8508 1726776624.71378: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776624.713653-8508-192963853828424 `" && echo ansible-tmp-1726776624.713653-8508-192963853828424="` echo /root/.ansible/tmp/ansible-tmp-1726776624.713653-8508-192963853828424 `" ) && sleep 0' 8508 1726776624.74434: stdout chunk (state=2): >>>ansible-tmp-1726776624.713653-8508-192963853828424=/root/.ansible/tmp/ansible-tmp-1726776624.713653-8508-192963853828424 <<< 8508 1726776624.74529: stderr chunk (state=3): >>><<< 8508 1726776624.74537: stdout chunk (state=3): >>><<< 8508 1726776624.74552: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776624.713653-8508-192963853828424=/root/.ansible/tmp/ansible-tmp-1726776624.713653-8508-192963853828424 , stderr= 8508 1726776624.74592: variable 'ansible_module_compression' from source: unknown 8508 1726776624.74628: ANSIBALLZ: Using lock for fedora.linux_system_roles.kernel_settings_get_config 8508 1726776624.74634: ANSIBALLZ: Acquiring lock 8508 1726776624.74637: ANSIBALLZ: Lock acquired: 140690875467872 8508 1726776624.74640: ANSIBALLZ: Creating module 8508 1726776624.87334: ANSIBALLZ: Writing module into payload 8508 1726776624.87417: ANSIBALLZ: Writing module 8508 1726776624.87441: ANSIBALLZ: Renaming module 8508 1726776624.87448: ANSIBALLZ: Done creating module 8508 1726776624.87471: variable 'ansible_facts' from source: unknown 8508 1726776624.87541: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776624.713653-8508-192963853828424/AnsiballZ_kernel_settings_get_config.py 8508 1726776624.88141: Sending initial data 8508 1726776624.88151: Sent initial data (172 bytes) 8508 1726776624.91450: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmp91o7c95m /root/.ansible/tmp/ansible-tmp-1726776624.713653-8508-192963853828424/AnsiballZ_kernel_settings_get_config.py <<< 8508 1726776624.93722: stderr chunk (state=3): >>><<< 8508 1726776624.93736: stdout chunk (state=3): >>><<< 8508 1726776624.93758: done transferring module to remote 8508 1726776624.93774: _low_level_execute_command(): starting 8508 1726776624.93780: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776624.713653-8508-192963853828424/ /root/.ansible/tmp/ansible-tmp-1726776624.713653-8508-192963853828424/AnsiballZ_kernel_settings_get_config.py && sleep 0' 8508 1726776624.97098: stderr chunk (state=2): >>><<< 8508 1726776624.97107: stdout chunk (state=2): >>><<< 8508 1726776624.97122: _low_level_execute_command() done: rc=0, stdout=, stderr= 8508 1726776624.97127: _low_level_execute_command(): starting 8508 1726776624.97135: 
_low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776624.713653-8508-192963853828424/AnsiballZ_kernel_settings_get_config.py && sleep 0' 8508 1726776625.13644: stdout chunk (state=2): >>> {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} <<< 8508 1726776625.14680: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 8508 1726776625.14731: stderr chunk (state=3): >>><<< 8508 1726776625.14739: stdout chunk (state=3): >>><<< 8508 1726776625.14757: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} , stderr=Shared connection to 10.31.8.186 closed. 8508 1726776625.14786: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/tuned-main.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776624.713653-8508-192963853828424/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8508 1726776625.14796: _low_level_execute_command(): starting 8508 1726776625.14802: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776624.713653-8508-192963853828424/ > /dev/null 2>&1 && sleep 0' 8508 1726776625.17296: stderr chunk (state=2): >>><<< 8508 1726776625.17307: stdout chunk (state=2): >>><<< 8508 1726776625.17322: _low_level_execute_command() done: rc=0, stdout=, stderr= 8508 1726776625.17332: handler run complete 8508 1726776625.17346: attempt loop complete, returning result 8508 1726776625.17350: _execute() done 8508 1726776625.17353: dumping result to json 8508 1726776625.17358: done dumping result, returning 8508 1726776625.17365: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config [120fa90a-8a95-c4e4-06a7-00000000002b] 8508 1726776625.17370: sending task result for task 120fa90a-8a95-c4e4-06a7-00000000002b 8508 1726776625.17401: done sending task result for task 120fa90a-8a95-c4e4-06a7-00000000002b 8508 1726776625.17405: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "data": { "daemon": "1", "default_instance_priority": "0", "dynamic_tuning": "0", "log_file_count": "2", "log_file_max_size": "1MB", "reapply_sysctl": "1", "recommend_command": "1", "sleep_interval": "1", "udev_buffer_size": "1MB", "update_interval": "10" } } 8283 1726776625.17563: no more pending results, returning what we have 8283 1726776625.17565: 
results queue empty 8283 1726776625.17566: checking for any_errors_fatal 8283 1726776625.17573: done checking for any_errors_fatal 8283 1726776625.17573: checking for max_fail_percentage 8283 1726776625.17575: done checking for max_fail_percentage 8283 1726776625.17576: checking to see if all hosts have failed and the running result is not ok 8283 1726776625.17576: done checking to see if all hosts have failed 8283 1726776625.17577: getting the remaining hosts for this loop 8283 1726776625.17578: done getting the remaining hosts for this loop 8283 1726776625.17581: getting the next task for host managed_node3 8283 1726776625.17588: done getting next task for host managed_node3 8283 1726776625.17591: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8283 1726776625.17593: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776625.17602: getting variables 8283 1726776625.17603: in VariableManager get_vars() 8283 1726776625.17633: Calling all_inventory to load vars for managed_node3 8283 1726776625.17636: Calling groups_inventory to load vars for managed_node3 8283 1726776625.17638: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776625.17646: Calling all_plugins_play to load vars for managed_node3 8283 1726776625.17648: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776625.17650: Calling groups_plugins_play to load vars for managed_node3 8283 1726776625.17696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776625.17723: done with get_vars() 8283 1726776625.17732: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 Thursday 19 September 2024 16:10:25 -0400 (0:00:00.509) 0:00:08.882 **** 8283 1726776625.17801: entering _queue_task() for managed_node3/stat 8283 1726776625.17997: worker is 1 (out of 1 available) 8283 1726776625.18010: exiting _queue_task() for managed_node3/stat 8283 1726776625.18021: done queuing things up, now waiting for results queue to drain 8283 1726776625.18023: waiting for pending results... 
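The "Find tuned profile parent directory" task is a stat loop: the worker trace below builds a candidate list from the tuned main config that was just read (__prof_from_conf, __data, __kernel_settings_tuned_dir), skips empty entries via 'item | length > 0', and stats the rest. A rough sketch, with the loop source and register name as assumptions:

- name: Find tuned profile parent directory
  stat:
    path: "{{ item }}"
  loop: "{{ __prof_from_conf }}"      # assumed loop source; visible only as a task var below
  when: item | length > 0             # guard taken verbatim from the skip result below
  register: __kernel_settings_find_profile_dirs   # hypothetical name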
8541 1726776625.18251: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8541 1726776625.18379: in run() - task 120fa90a-8a95-c4e4-06a7-00000000002c 8541 1726776625.18397: variable 'ansible_search_path' from source: unknown 8541 1726776625.18401: variable 'ansible_search_path' from source: unknown 8541 1726776625.18443: variable '__prof_from_conf' from source: task vars 8541 1726776625.18726: variable '__prof_from_conf' from source: task vars 8541 1726776625.18864: variable '__data' from source: task vars 8541 1726776625.18918: variable '__kernel_settings_register_tuned_main' from source: set_fact 8541 1726776625.19060: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8541 1726776625.19073: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8541 1726776625.19116: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8541 1726776625.19172: variable 'omit' from source: magic vars 8541 1726776625.19244: variable 'ansible_host' from source: host vars for 'managed_node3' 8541 1726776625.19253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8541 1726776625.19262: variable 'omit' from source: magic vars 8541 1726776625.19432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8541 1726776625.20944: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8541 1726776625.20991: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8541 1726776625.21020: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8541 1726776625.21050: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8541 1726776625.21070: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8541 1726776625.21125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8541 1726776625.21151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8541 1726776625.21170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8541 1726776625.21198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8541 1726776625.21210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8541 1726776625.21293: variable 'item' from source: unknown 8541 1726776625.21307: Evaluated conditional (item | length > 0): False 8541 1726776625.21312: when evaluation is False, skipping this task 8541 1726776625.21359: variable 'item' from source: unknown 8541 1726776625.21419: variable 'item' from source: unknown skipping: [managed_node3] => (item=) => { 
"ansible_loop_var": "item", "changed": false, "false_condition": "item | length > 0", "item": "", "skip_reason": "Conditional result was False" } 8541 1726776625.21489: variable 'ansible_host' from source: host vars for 'managed_node3' 8541 1726776625.21498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8541 1726776625.21506: variable 'omit' from source: magic vars 8541 1726776625.21636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8541 1726776625.21660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8541 1726776625.21861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8541 1726776625.21899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8541 1726776625.21913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8541 1726776625.21985: variable 'item' from source: unknown 8541 1726776625.21995: Evaluated conditional (item | length > 0): True 8541 1726776625.22001: variable 'omit' from source: magic vars 8541 1726776625.22038: variable 'omit' from source: magic vars 8541 1726776625.22079: variable 'item' from source: unknown 8541 1726776625.22141: variable 'item' from source: unknown 8541 1726776625.22157: variable 'omit' from source: magic vars 8541 1726776625.22190: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8541 1726776625.22215: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8541 1726776625.22233: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8541 1726776625.22250: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8541 1726776625.22260: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8541 1726776625.22287: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8541 1726776625.22293: variable 'ansible_host' from source: host vars for 'managed_node3' 8541 1726776625.22297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8541 1726776625.22386: Set connection var ansible_module_compression to ZIP_DEFLATED 8541 1726776625.22395: Set connection var ansible_shell_type to sh 8541 1726776625.22402: Set connection var ansible_timeout to 10 8541 1726776625.22407: Set connection var ansible_connection to ssh 8541 1726776625.22415: Set connection var ansible_pipelining to False 8541 1726776625.22420: Set connection var ansible_shell_executable to /bin/sh 8541 1726776625.22606: variable 'ansible_shell_executable' from source: unknown 8541 1726776625.22612: variable 'ansible_connection' from source: unknown 8541 1726776625.22616: variable 
'ansible_module_compression' from source: unknown 8541 1726776625.22619: variable 'ansible_shell_type' from source: unknown 8541 1726776625.22621: variable 'ansible_shell_executable' from source: unknown 8541 1726776625.22624: variable 'ansible_host' from source: host vars for 'managed_node3' 8541 1726776625.22628: variable 'ansible_pipelining' from source: unknown 8541 1726776625.22632: variable 'ansible_timeout' from source: unknown 8541 1726776625.22636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8541 1726776625.22757: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 8541 1726776625.22768: variable 'omit' from source: magic vars 8541 1726776625.22773: starting attempt loop 8541 1726776625.22776: running the handler 8541 1726776625.22787: _low_level_execute_command(): starting 8541 1726776625.22794: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8541 1726776625.25472: stdout chunk (state=2): >>>/root <<< 8541 1726776625.25562: stderr chunk (state=3): >>><<< 8541 1726776625.25571: stdout chunk (state=3): >>><<< 8541 1726776625.25592: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8541 1726776625.25606: _low_level_execute_command(): starting 8541 1726776625.25612: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776625.2560098-8541-280225864612734 `" && echo ansible-tmp-1726776625.2560098-8541-280225864612734="` echo /root/.ansible/tmp/ansible-tmp-1726776625.2560098-8541-280225864612734 `" ) && sleep 0' 8541 1726776625.28502: stdout chunk (state=2): >>>ansible-tmp-1726776625.2560098-8541-280225864612734=/root/.ansible/tmp/ansible-tmp-1726776625.2560098-8541-280225864612734 <<< 8541 1726776625.28514: stderr chunk (state=2): >>><<< 8541 1726776625.28527: stdout chunk (state=3): >>><<< 8541 1726776625.28543: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776625.2560098-8541-280225864612734=/root/.ansible/tmp/ansible-tmp-1726776625.2560098-8541-280225864612734 , stderr= 8541 1726776625.28585: variable 'ansible_module_compression' from source: unknown 8541 1726776625.28641: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8541 1726776625.28675: variable 'ansible_facts' from source: unknown 8541 1726776625.28779: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776625.2560098-8541-280225864612734/AnsiballZ_stat.py 8541 1726776625.29257: Sending initial data 8541 1726776625.29265: Sent initial data (151 bytes) 8541 1726776625.33918: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmp17dp__4b /root/.ansible/tmp/ansible-tmp-1726776625.2560098-8541-280225864612734/AnsiballZ_stat.py <<< 8541 1726776625.35156: stderr chunk (state=3): >>><<< 8541 1726776625.35165: stdout chunk (state=3): >>><<< 8541 1726776625.35184: done transferring module to remote 8541 1726776625.35192: _low_level_execute_command(): starting 8541 1726776625.35196: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776625.2560098-8541-280225864612734/ /root/.ansible/tmp/ansible-tmp-1726776625.2560098-8541-280225864612734/AnsiballZ_stat.py && sleep 0' 
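The sequence just above (home directory probe, remote temp directory creation, sftp transfer of AnsiballZ_stat.py, chmod) is the non-pipelined module delivery path; the trace earlier sets the connection var ansible_pipelining to False, so every module run repeats these round trips. A minimal sketch of how that could be switched on through an inventory group var, assuming the managed hosts' sudo/tty configuration permits pipelining:

    # group_vars/all.yml (hypothetical): with pipelining enabled the module source is
    # streamed to the remote interpreter over the existing SSH session, so the
    # mkdir/put/chmod steps seen above are skipped.
    ansible_pipelining: true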
8541 1726776625.37521: stderr chunk (state=2): >>><<< 8541 1726776625.37530: stdout chunk (state=2): >>><<< 8541 1726776625.37552: _low_level_execute_command() done: rc=0, stdout=, stderr= 8541 1726776625.37558: _low_level_execute_command(): starting 8541 1726776625.37561: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776625.2560098-8541-280225864612734/AnsiballZ_stat.py && sleep 0' 8541 1726776625.52466: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8541 1726776625.53523: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 8541 1726776625.53576: stderr chunk (state=3): >>><<< 8541 1726776625.53583: stdout chunk (state=3): >>><<< 8541 1726776625.53598: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.8.186 closed. 8541 1726776625.53619: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776625.2560098-8541-280225864612734/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8541 1726776625.53630: _low_level_execute_command(): starting 8541 1726776625.53636: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776625.2560098-8541-280225864612734/ > /dev/null 2>&1 && sleep 0' 8541 1726776625.56050: stderr chunk (state=2): >>><<< 8541 1726776625.56059: stdout chunk (state=2): >>><<< 8541 1726776625.56076: _low_level_execute_command() done: rc=0, stdout=, stderr= 8541 1726776625.56082: handler run complete 8541 1726776625.56098: attempt loop complete, returning result 8541 1726776625.56114: variable 'item' from source: unknown 8541 1726776625.56181: variable 'item' from source: unknown ok: [managed_node3] => (item=/etc/tuned/profiles) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned/profiles", "stat": { "exists": false } } 8541 1726776625.56269: variable 'ansible_host' from source: host vars for 'managed_node3' 8541 1726776625.56281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8541 1726776625.56291: variable 'omit' from source: magic vars 8541 1726776625.56399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8541 1726776625.56423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8541 1726776625.56443: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8541 1726776625.56472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8541 1726776625.56484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8541 1726776625.56547: variable 'item' from source: unknown 8541 1726776625.56556: Evaluated conditional (item | length > 0): True 8541 1726776625.56561: variable 'omit' from source: magic vars 8541 1726776625.56575: variable 'omit' from source: magic vars 8541 1726776625.56602: variable 'item' from source: unknown 8541 1726776625.56652: variable 'item' from source: unknown 8541 1726776625.56666: variable 'omit' from source: magic vars 8541 1726776625.56684: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8541 1726776625.56693: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8541 1726776625.56699: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8541 1726776625.56712: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8541 1726776625.56716: variable 'ansible_host' from source: host vars for 'managed_node3' 8541 1726776625.56720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8541 1726776625.56772: Set connection var ansible_module_compression to ZIP_DEFLATED 8541 1726776625.56779: Set connection var ansible_shell_type to sh 8541 1726776625.56786: Set connection var ansible_timeout to 10 8541 1726776625.56791: Set connection var ansible_connection to ssh 8541 1726776625.56797: Set connection var ansible_pipelining to False 8541 1726776625.56802: Set connection var ansible_shell_executable to /bin/sh 8541 1726776625.56816: variable 'ansible_shell_executable' from source: unknown 8541 1726776625.56819: variable 'ansible_connection' from source: unknown 8541 1726776625.56822: variable 'ansible_module_compression' from source: unknown 8541 1726776625.56825: variable 'ansible_shell_type' from source: unknown 8541 1726776625.56830: variable 'ansible_shell_executable' from source: unknown 8541 1726776625.56833: variable 'ansible_host' from source: host vars for 'managed_node3' 8541 1726776625.56838: variable 'ansible_pipelining' from source: unknown 8541 1726776625.56841: variable 'ansible_timeout' from source: unknown 8541 1726776625.56845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8541 1726776625.56910: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8541 1726776625.56921: variable 'omit' from source: magic vars 8541 1726776625.56927: starting attempt loop 8541 1726776625.56932: running the handler 8541 1726776625.56939: _low_level_execute_command(): starting 8541 
1726776625.56943: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8541 1726776625.59104: stdout chunk (state=2): >>>/root <<< 8541 1726776625.59232: stderr chunk (state=3): >>><<< 8541 1726776625.59240: stdout chunk (state=3): >>><<< 8541 1726776625.59255: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8541 1726776625.59264: _low_level_execute_command(): starting 8541 1726776625.59272: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776625.5926137-8541-186369952402186 `" && echo ansible-tmp-1726776625.5926137-8541-186369952402186="` echo /root/.ansible/tmp/ansible-tmp-1726776625.5926137-8541-186369952402186 `" ) && sleep 0' 8541 1726776625.61707: stdout chunk (state=2): >>>ansible-tmp-1726776625.5926137-8541-186369952402186=/root/.ansible/tmp/ansible-tmp-1726776625.5926137-8541-186369952402186 <<< 8541 1726776625.61836: stderr chunk (state=3): >>><<< 8541 1726776625.61843: stdout chunk (state=3): >>><<< 8541 1726776625.61856: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776625.5926137-8541-186369952402186=/root/.ansible/tmp/ansible-tmp-1726776625.5926137-8541-186369952402186 , stderr= 8541 1726776625.61887: variable 'ansible_module_compression' from source: unknown 8541 1726776625.61925: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8541 1726776625.61943: variable 'ansible_facts' from source: unknown 8541 1726776625.61999: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776625.5926137-8541-186369952402186/AnsiballZ_stat.py 8541 1726776625.62089: Sending initial data 8541 1726776625.62096: Sent initial data (151 bytes) 8541 1726776625.64512: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmphvufwe89 /root/.ansible/tmp/ansible-tmp-1726776625.5926137-8541-186369952402186/AnsiballZ_stat.py <<< 8541 1726776625.65593: stderr chunk (state=3): >>><<< 8541 1726776625.65599: stdout chunk (state=3): >>><<< 8541 1726776625.65615: done transferring module to remote 8541 1726776625.65623: _low_level_execute_command(): starting 8541 1726776625.65630: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776625.5926137-8541-186369952402186/ /root/.ansible/tmp/ansible-tmp-1726776625.5926137-8541-186369952402186/AnsiballZ_stat.py && sleep 0' 8541 1726776625.67913: stderr chunk (state=2): >>><<< 8541 1726776625.67920: stdout chunk (state=2): >>><<< 8541 1726776625.67934: _low_level_execute_command() done: rc=0, stdout=, stderr= 8541 1726776625.67940: _low_level_execute_command(): starting 8541 1726776625.67946: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776625.5926137-8541-186369952402186/AnsiballZ_stat.py && sleep 0' 8541 1726776625.83505: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 136, "inode": 917919, "dev": 51713, "nlink": 3, "atime": 1726776414.7335675, "mtime": 1716968741.377, "ctime": 1716968741.377, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 
4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8541 1726776625.84598: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 8541 1726776625.84647: stderr chunk (state=3): >>><<< 8541 1726776625.84654: stdout chunk (state=3): >>><<< 8541 1726776625.84673: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 136, "inode": 917919, "dev": 51713, "nlink": 3, "atime": 1726776414.7335675, "mtime": 1716968741.377, "ctime": 1716968741.377, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.8.186 closed. 8541 1726776625.84709: done with _execute_module (stat, {'path': '/etc/tuned', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776625.5926137-8541-186369952402186/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8541 1726776625.84717: _low_level_execute_command(): starting 8541 1726776625.84725: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776625.5926137-8541-186369952402186/ > /dev/null 2>&1 && sleep 0' 8541 1726776625.87088: stderr chunk (state=2): >>><<< 8541 1726776625.87095: stdout chunk (state=2): >>><<< 8541 1726776625.87108: _low_level_execute_command() done: rc=0, stdout=, stderr= 8541 1726776625.87115: handler run complete 8541 1726776625.87147: attempt loop complete, returning result 8541 1726776625.87163: variable 'item' from source: unknown 8541 1726776625.87224: variable 'item' from source: unknown ok: [managed_node3] => (item=/etc/tuned) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned", "stat": { "atime": 1726776414.7335675, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1716968741.377, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 917919, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": 
"inode/directory", "mode": "0755", "mtime": 1716968741.377, "nlink": 3, "path": "/etc/tuned", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 136, "uid": 0, "version": "1785990601", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 8541 1726776625.87267: dumping result to json 8541 1726776625.87279: done dumping result, returning 8541 1726776625.87287: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory [120fa90a-8a95-c4e4-06a7-00000000002c] 8541 1726776625.87292: sending task result for task 120fa90a-8a95-c4e4-06a7-00000000002c 8541 1726776625.87335: done sending task result for task 120fa90a-8a95-c4e4-06a7-00000000002c 8541 1726776625.87339: WORKER PROCESS EXITING 8283 1726776625.87568: no more pending results, returning what we have 8283 1726776625.87571: results queue empty 8283 1726776625.87571: checking for any_errors_fatal 8283 1726776625.87575: done checking for any_errors_fatal 8283 1726776625.87575: checking for max_fail_percentage 8283 1726776625.87576: done checking for max_fail_percentage 8283 1726776625.87577: checking to see if all hosts have failed and the running result is not ok 8283 1726776625.87577: done checking to see if all hosts have failed 8283 1726776625.87578: getting the remaining hosts for this loop 8283 1726776625.87579: done getting the remaining hosts for this loop 8283 1726776625.87581: getting the next task for host managed_node3 8283 1726776625.87586: done getting next task for host managed_node3 8283 1726776625.87589: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8283 1726776625.87591: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8283 1726776625.87599: getting variables 8283 1726776625.87600: in VariableManager get_vars() 8283 1726776625.87619: Calling all_inventory to load vars for managed_node3 8283 1726776625.87621: Calling groups_inventory to load vars for managed_node3 8283 1726776625.87622: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776625.87629: Calling all_plugins_play to load vars for managed_node3 8283 1726776625.87632: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776625.87634: Calling groups_plugins_play to load vars for managed_node3 8283 1726776625.87667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776625.87692: done with get_vars() 8283 1726776625.87697: done getting variables 8283 1726776625.87738: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:63 Thursday 19 September 2024 16:10:25 -0400 (0:00:00.699) 0:00:09.582 **** 8283 1726776625.87759: entering _queue_task() for managed_node3/set_fact 8283 1726776625.87912: worker is 1 (out of 1 available) 8283 1726776625.87924: exiting _queue_task() for managed_node3/set_fact 8283 1726776625.87936: done queuing things up, now waiting for results queue to drain 8283 1726776625.87938: waiting for pending results... 8581 1726776625.88046: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8581 1726776625.88143: in run() - task 120fa90a-8a95-c4e4-06a7-00000000002d 8581 1726776625.88157: variable 'ansible_search_path' from source: unknown 8581 1726776625.88162: variable 'ansible_search_path' from source: unknown 8581 1726776625.88191: calling self._execute() 8581 1726776625.88236: variable 'ansible_host' from source: host vars for 'managed_node3' 8581 1726776625.88245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8581 1726776625.88252: variable 'omit' from source: magic vars 8581 1726776625.88322: variable 'omit' from source: magic vars 8581 1726776625.88354: variable 'omit' from source: magic vars 8581 1726776625.88665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8581 1726776625.90153: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8581 1726776625.90200: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8581 1726776625.90229: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8581 1726776625.90386: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8581 1726776625.90407: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8581 1726776625.90462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 8581 1726776625.90485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8581 1726776625.90504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8581 1726776625.90535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8581 1726776625.90549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8581 1726776625.90583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8581 1726776625.90600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8581 1726776625.90617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8581 1726776625.90647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8581 1726776625.90659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8581 1726776625.90699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8581 1726776625.90716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8581 1726776625.90736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8581 1726776625.90763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8581 1726776625.90778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8581 1726776625.90919: variable '__kernel_settings_find_profile_dirs' from source: set_fact 8581 1726776625.90989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8581 1726776625.91097: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8581 1726776625.91123: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8581 1726776625.91152: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8581 1726776625.91177: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8581 1726776625.91207: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8581 1726776625.91224: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8581 1726776625.91243: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8581 1726776625.91261: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8581 1726776625.91298: variable 'omit' from source: magic vars 8581 1726776625.91319: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8581 1726776625.91341: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8581 1726776625.91355: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8581 1726776625.91368: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8581 1726776625.91380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8581 1726776625.91403: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8581 1726776625.91408: variable 'ansible_host' from source: host vars for 'managed_node3' 8581 1726776625.91412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8581 1726776625.91477: Set connection var ansible_module_compression to ZIP_DEFLATED 8581 1726776625.91485: Set connection var ansible_shell_type to sh 8581 1726776625.91491: Set connection var ansible_timeout to 10 8581 1726776625.91497: Set connection var ansible_connection to ssh 8581 1726776625.91503: Set connection var ansible_pipelining to False 8581 1726776625.91506: Set connection var ansible_shell_executable to /bin/sh 8581 1726776625.91519: variable 'ansible_shell_executable' from source: unknown 8581 1726776625.91522: variable 'ansible_connection' from source: unknown 8581 1726776625.91525: variable 'ansible_module_compression' from source: unknown 8581 1726776625.91527: variable 'ansible_shell_type' from source: unknown 8581 1726776625.91546: variable 'ansible_shell_executable' from source: unknown 8581 1726776625.91550: variable 'ansible_host' from source: host vars for 'managed_node3' 8581 1726776625.91555: variable 'ansible_pipelining' from source: unknown 8581 1726776625.91558: variable 'ansible_timeout' from source: unknown 8581 1726776625.91563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8581 1726776625.91620: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8581 1726776625.91632: variable 'omit' from source: magic vars 8581 1726776625.91638: starting attempt loop 8581 1726776625.91642: running the handler 8581 1726776625.91651: handler run complete 8581 1726776625.91658: attempt loop complete, returning result 8581 1726776625.91661: _execute() done 8581 1726776625.91664: dumping result to json 8581 1726776625.91667: done dumping result, returning 8581 1726776625.91675: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir [120fa90a-8a95-c4e4-06a7-00000000002d] 8581 1726776625.91681: sending task result for task 120fa90a-8a95-c4e4-06a7-00000000002d 8581 1726776625.91699: done sending task result for task 120fa90a-8a95-c4e4-06a7-00000000002d 8581 1726776625.91702: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_profile_parent": "/etc/tuned" }, "changed": false } 8283 1726776625.91809: no more pending results, returning what we have 8283 1726776625.91812: results queue empty 8283 1726776625.91812: checking for any_errors_fatal 8283 1726776625.91819: done checking for any_errors_fatal 8283 1726776625.91819: checking for max_fail_percentage 8283 1726776625.91820: done checking for max_fail_percentage 8283 1726776625.91821: checking to see if all hosts have failed and the running result is not ok 8283 1726776625.91821: done checking to see if all hosts have failed 8283 1726776625.91822: getting the remaining hosts for this loop 8283 1726776625.91823: done getting the remaining hosts for this loop 8283 1726776625.91826: getting the next task for host managed_node3 8283 1726776625.91832: done getting next task for host managed_node3 8283 1726776625.91835: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8283 1726776625.91837: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8283 1726776625.91845: getting variables 8283 1726776625.91847: in VariableManager get_vars() 8283 1726776625.91876: Calling all_inventory to load vars for managed_node3 8283 1726776625.91878: Calling groups_inventory to load vars for managed_node3 8283 1726776625.91880: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776625.91887: Calling all_plugins_play to load vars for managed_node3 8283 1726776625.91889: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776625.91892: Calling groups_plugins_play to load vars for managed_node3 8283 1726776625.91932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776625.91958: done with get_vars() 8283 1726776625.91963: done getting variables 8283 1726776625.92025: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67 Thursday 19 September 2024 16:10:25 -0400 (0:00:00.042) 0:00:09.625 **** 8283 1726776625.92049: entering _queue_task() for managed_node3/service 8283 1726776625.92050: Creating lock for service 8283 1726776625.92216: worker is 1 (out of 1 available) 8283 1726776625.92230: exiting _queue_task() for managed_node3/service 8283 1726776625.92241: done queuing things up, now waiting for results queue to drain 8283 1726776625.92242: waiting for pending results... 
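The service task queued here (tasks/main.yml:67) first runs a minimal fact gather to pick up ansible_service_mgr and then drives the systemd backend; the final invocation recorded below is name=tuned, state=started, enabled=true. A sketch of the kind of task that produces that invocation, with the loop variable name taken from the trace (the only item exercised in this run is tuned):

    # Illustrative sketch; __kernel_settings_services is loaded by the role's include_vars.
    - name: Ensure required services are enabled and started
      ansible.builtin.service:
        name: "{{ item }}"
        state: started
        enabled: true
      loop: "{{ __kernel_settings_services }}"

Because ansible.builtin.service delegates to the detected service manager, this run ends up in the systemd module, which is why the trace below shows AnsiballZ_systemd.py being built and transferred after the ansible_service_mgr fact comes back as systemd.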
8582 1726776625.92343: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8582 1726776625.92436: in run() - task 120fa90a-8a95-c4e4-06a7-00000000002e 8582 1726776625.92449: variable 'ansible_search_path' from source: unknown 8582 1726776625.92453: variable 'ansible_search_path' from source: unknown 8582 1726776625.92486: variable '__kernel_settings_services' from source: include_vars 8582 1726776625.92689: variable '__kernel_settings_services' from source: include_vars 8582 1726776625.92742: variable 'omit' from source: magic vars 8582 1726776625.92809: variable 'ansible_host' from source: host vars for 'managed_node3' 8582 1726776625.92819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8582 1726776625.92827: variable 'omit' from source: magic vars 8582 1726776625.92879: variable 'omit' from source: magic vars 8582 1726776625.92909: variable 'omit' from source: magic vars 8582 1726776625.92940: variable 'item' from source: unknown 8582 1726776625.92993: variable 'item' from source: unknown 8582 1726776625.93012: variable 'omit' from source: magic vars 8582 1726776625.93047: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8582 1726776625.93073: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8582 1726776625.93088: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8582 1726776625.93132: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8582 1726776625.93143: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8582 1726776625.93164: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8582 1726776625.93169: variable 'ansible_host' from source: host vars for 'managed_node3' 8582 1726776625.93176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8582 1726776625.93256: Set connection var ansible_module_compression to ZIP_DEFLATED 8582 1726776625.93264: Set connection var ansible_shell_type to sh 8582 1726776625.93272: Set connection var ansible_timeout to 10 8582 1726776625.93278: Set connection var ansible_connection to ssh 8582 1726776625.93285: Set connection var ansible_pipelining to False 8582 1726776625.93290: Set connection var ansible_shell_executable to /bin/sh 8582 1726776625.93304: variable 'ansible_shell_executable' from source: unknown 8582 1726776625.93309: variable 'ansible_connection' from source: unknown 8582 1726776625.93312: variable 'ansible_module_compression' from source: unknown 8582 1726776625.93316: variable 'ansible_shell_type' from source: unknown 8582 1726776625.93319: variable 'ansible_shell_executable' from source: unknown 8582 1726776625.93322: variable 'ansible_host' from source: host vars for 'managed_node3' 8582 1726776625.93327: variable 'ansible_pipelining' from source: unknown 8582 1726776625.93396: variable 'ansible_timeout' from source: unknown 8582 1726776625.93401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8582 1726776625.93513: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8582 1726776625.93524: variable 'omit' from source: magic vars 8582 1726776625.93532: starting attempt loop 8582 1726776625.93535: running the handler 8582 1726776625.93618: variable 'ansible_facts' from source: unknown 8582 1726776625.93655: _low_level_execute_command(): starting 8582 1726776625.93664: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8582 1726776625.96067: stdout chunk (state=2): >>>/root <<< 8582 1726776625.96439: stderr chunk (state=3): >>><<< 8582 1726776625.96446: stdout chunk (state=3): >>><<< 8582 1726776625.96465: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8582 1726776625.96483: _low_level_execute_command(): starting 8582 1726776625.96490: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776625.9647684-8582-63079253201534 `" && echo ansible-tmp-1726776625.9647684-8582-63079253201534="` echo /root/.ansible/tmp/ansible-tmp-1726776625.9647684-8582-63079253201534 `" ) && sleep 0' 8582 1726776625.99086: stdout chunk (state=2): >>>ansible-tmp-1726776625.9647684-8582-63079253201534=/root/.ansible/tmp/ansible-tmp-1726776625.9647684-8582-63079253201534 <<< 8582 1726776625.99208: stderr chunk (state=3): >>><<< 8582 1726776625.99219: stdout chunk (state=3): >>><<< 8582 1726776625.99241: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776625.9647684-8582-63079253201534=/root/.ansible/tmp/ansible-tmp-1726776625.9647684-8582-63079253201534 , stderr= 8582 1726776625.99267: variable 'ansible_module_compression' from source: unknown 8582 1726776625.99305: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 8582 1726776625.99356: variable 'ansible_facts' from source: unknown 8582 1726776625.99507: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776625.9647684-8582-63079253201534/AnsiballZ_setup.py 8582 1726776625.99632: Sending initial data 8582 1726776625.99639: Sent initial data (151 bytes) 8582 1726776626.02209: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmphtz_xjwm /root/.ansible/tmp/ansible-tmp-1726776625.9647684-8582-63079253201534/AnsiballZ_setup.py <<< 8582 1726776626.04972: stderr chunk (state=3): >>><<< 8582 1726776626.04986: stdout chunk (state=3): >>><<< 8582 1726776626.05012: done transferring module to remote 8582 1726776626.05025: _low_level_execute_command(): starting 8582 1726776626.05033: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776625.9647684-8582-63079253201534/ /root/.ansible/tmp/ansible-tmp-1726776625.9647684-8582-63079253201534/AnsiballZ_setup.py && sleep 0' 8582 1726776626.07588: stderr chunk (state=2): >>><<< 8582 1726776626.07598: stdout chunk (state=2): >>><<< 8582 1726776626.07613: _low_level_execute_command() done: rc=0, stdout=, stderr= 8582 1726776626.07617: _low_level_execute_command(): starting 8582 1726776626.07622: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776625.9647684-8582-63079253201534/AnsiballZ_setup.py && sleep 0' 8582 1726776626.35194: stdout chunk (state=2): >>> {"ansible_facts": {"ansible_service_mgr": "systemd"}, "invocation": 
{"module_args": {"gather_subset": ["!all"], "filter": ["ansible_service_mgr"], "gather_timeout": 10, "fact_path": "/etc/ansible/facts.d"}}} <<< 8582 1726776626.36732: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 8582 1726776626.36777: stderr chunk (state=3): >>><<< 8582 1726776626.36784: stdout chunk (state=3): >>><<< 8582 1726776626.36800: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_service_mgr": "systemd"}, "invocation": {"module_args": {"gather_subset": ["!all"], "filter": ["ansible_service_mgr"], "gather_timeout": 10, "fact_path": "/etc/ansible/facts.d"}}} , stderr=Shared connection to 10.31.8.186 closed. 8582 1726776626.36826: done with _execute_module (ansible.legacy.setup, {'gather_subset': '!all', 'filter': 'ansible_service_mgr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776625.9647684-8582-63079253201534/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8582 1726776626.36847: Facts {'ansible_facts': {'ansible_service_mgr': 'systemd'}, 'invocation': {'module_args': {'gather_subset': ['!all'], 'filter': ['ansible_service_mgr'], 'gather_timeout': 10, 'fact_path': '/etc/ansible/facts.d'}}, '_ansible_parsed': True} 8582 1726776626.36901: variable 'ansible_module_compression' from source: unknown 8582 1726776626.36939: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 8582 1726776626.36943: ANSIBALLZ: Acquiring lock 8582 1726776626.36947: ANSIBALLZ: Lock acquired: 140690877500448 8582 1726776626.36951: ANSIBALLZ: Creating module 8582 1726776626.61144: ANSIBALLZ: Writing module into payload 8582 1726776626.61287: ANSIBALLZ: Writing module 8582 1726776626.61311: ANSIBALLZ: Renaming module 8582 1726776626.61319: ANSIBALLZ: Done creating module 8582 1726776626.61345: variable 'ansible_facts' from source: unknown 8582 1726776626.61494: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776625.9647684-8582-63079253201534/AnsiballZ_systemd.py 8582 1726776626.61591: Sending initial data 8582 1726776626.61598: Sent initial data (153 bytes) 8582 1726776626.64116: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpf_c3ru_p /root/.ansible/tmp/ansible-tmp-1726776625.9647684-8582-63079253201534/AnsiballZ_systemd.py <<< 8582 1726776626.65950: stderr chunk (state=3): >>><<< 8582 1726776626.65958: stdout chunk (state=3): >>><<< 8582 1726776626.65979: done transferring module to remote 8582 1726776626.65989: _low_level_execute_command(): starting 8582 1726776626.65994: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776625.9647684-8582-63079253201534/ /root/.ansible/tmp/ansible-tmp-1726776625.9647684-8582-63079253201534/AnsiballZ_systemd.py && sleep 0' 8582 1726776626.68296: stderr chunk (state=2): >>><<< 8582 1726776626.68306: stdout chunk (state=2): >>><<< 8582 1726776626.68321: _low_level_execute_command() done: rc=0, stdout=, stderr= 8582 1726776626.68325: _low_level_execute_command(): starting 8582 1726776626.68332: _low_level_execute_command(): executing: 
/bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776625.9647684-8582-63079253201534/AnsiballZ_systemd.py && sleep 0' 8582 1726776626.96351: stdout chunk (state=2): >>> {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:06:54 EDT", "WatchdogTimestampMonotonic": "23956134", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "677", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:06:53 EDT", "ExecMainStartTimestampMonotonic": "22773431", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "677", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:06:53 EDT] ; stop_time=[n/a] ; pid=677 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18632704", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh":<<< 8582 1726776626.96394: stdout chunk (state=3): >>> "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": 
"n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "sysinit.target system.slice dbus.socket dbus.service", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target cpupower.service", "Before": "multi-user.target shutdown.target", "After": "system.slice systemd-journald.socket dbus.service network.target sysinit.target basic.target dbus.socket systemd-sysctl.service polkit.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:06:54 EDT", "StateChangeTimestampMonotonic": "23956138", "InactiveExitTimestamp": "Thu 2024-09-19 16:06:53 EDT", "InactiveExitTimestampMonotonic": "22773473", "ActiveEnterTimestamp": "Thu 2024-09-19 16:06:54 EDT", "ActiveEnterTimestampMonotonic": "23956138", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": 
"none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:06:53 EDT", "ConditionTimestampMonotonic": "22772029", "AssertTimestamp": "Thu 2024-09-19 16:06:53 EDT", "AssertTimestampMonotonic": "22772031", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "91929146ef1a4ea9a56f8b38e1888644", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 8582 1726776626.98048: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 8582 1726776626.98098: stderr chunk (state=3): >>><<< 8582 1726776626.98106: stdout chunk (state=3): >>><<< 8582 1726776626.98126: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:06:54 EDT", "WatchdogTimestampMonotonic": "23956134", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "677", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:06:53 EDT", "ExecMainStartTimestampMonotonic": "22773431", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "677", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:06:53 EDT] ; stop_time=[n/a] ; pid=677 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18632704", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", 
"LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "sysinit.target system.slice dbus.socket dbus.service", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target cpupower.service", "Before": "multi-user.target shutdown.target", "After": "system.slice systemd-journald.socket dbus.service network.target sysinit.target basic.target dbus.socket systemd-sysctl.service polkit.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", 
"StateChangeTimestamp": "Thu 2024-09-19 16:06:54 EDT", "StateChangeTimestampMonotonic": "23956138", "InactiveExitTimestamp": "Thu 2024-09-19 16:06:53 EDT", "InactiveExitTimestampMonotonic": "22773473", "ActiveEnterTimestamp": "Thu 2024-09-19 16:06:54 EDT", "ActiveEnterTimestampMonotonic": "23956138", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:06:53 EDT", "ConditionTimestampMonotonic": "22772029", "AssertTimestamp": "Thu 2024-09-19 16:06:53 EDT", "AssertTimestampMonotonic": "22772031", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "91929146ef1a4ea9a56f8b38e1888644", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.8.186 closed. 8582 1726776626.98222: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776625.9647684-8582-63079253201534/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8582 1726776626.98241: _low_level_execute_command(): starting 8582 1726776626.98248: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776625.9647684-8582-63079253201534/ > /dev/null 2>&1 && sleep 0' 8582 1726776627.00657: stderr chunk (state=2): >>><<< 8582 1726776627.00665: stdout chunk (state=2): >>><<< 8582 1726776627.00679: _low_level_execute_command() done: rc=0, stdout=, stderr= 8582 1726776627.00688: handler run complete 8582 1726776627.00721: attempt loop complete, returning result 8582 1726776627.00742: variable 'item' from source: unknown 8582 1726776627.00803: variable 'item' from source: unknown ok: [managed_node3] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 16:06:54 EDT", "ActiveEnterTimestampMonotonic": "23956138", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "system.slice systemd-journald.socket dbus.service network.target sysinit.target basic.target dbus.socket systemd-sysctl.service polkit.service", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", 
"AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 16:06:53 EDT", "AssertTimestampMonotonic": "22772031", "Before": "multi-user.target shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:06:53 EDT", "ConditionTimestampMonotonic": "22772029", "ConfigurationDirectoryMode": "0755", "Conflicts": "auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target cpupower.service", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "677", "ExecMainStartTimestamp": "Thu 2024-09-19 16:06:53 EDT", "ExecMainStartTimestampMonotonic": "22773431", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:06:53 EDT] ; stop_time=[n/a] ; pid=677 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Thu 2024-09-19 16:06:53 EDT", "InactiveExitTimestampMonotonic": "22773473", "InvocationID": "91929146ef1a4ea9a56f8b38e1888644", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", 
"LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "677", "MemoryAccounting": "yes", "MemoryCurrent": "18632704", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "sysinit.target system.slice dbus.socket dbus.service", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 16:06:54 EDT", "StateChangeTimestampMonotonic": "23956138", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 16:06:54 EDT", "WatchdogTimestampMonotonic": "23956134", "WatchdogUSec": "0" } } 8582 1726776627.01671: dumping result to json 8582 1726776627.01689: done dumping result, returning 8582 1726776627.01698: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started [120fa90a-8a95-c4e4-06a7-00000000002e] 8582 1726776627.01704: sending task result for task 120fa90a-8a95-c4e4-06a7-00000000002e 8582 1726776627.01795: done sending task result for task 120fa90a-8a95-c4e4-06a7-00000000002e 8582 1726776627.01798: WORKER PROCESS EXITING 8283 1726776627.02600: no more pending results, returning what we have 8283 1726776627.02603: results queue empty 8283 1726776627.02604: checking for any_errors_fatal 8283 1726776627.02607: done checking for any_errors_fatal 8283 1726776627.02608: checking for max_fail_percentage 8283 1726776627.02609: done checking for max_fail_percentage 8283 1726776627.02609: checking to see if all hosts have failed and the running result is not ok 8283 1726776627.02610: done checking to see if all hosts have failed 8283 1726776627.02611: getting the remaining hosts for this loop 8283 1726776627.02612: done getting the remaining hosts for this loop 8283 1726776627.02614: getting the next task for host managed_node3 8283 1726776627.02619: done getting next task for host managed_node3 8283 1726776627.02622: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8283 1726776627.02625: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776627.02635: getting variables 8283 1726776627.02636: in VariableManager get_vars() 8283 1726776627.02661: Calling all_inventory to load vars for managed_node3 8283 1726776627.02663: Calling groups_inventory to load vars for managed_node3 8283 1726776627.02665: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776627.02676: Calling all_plugins_play to load vars for managed_node3 8283 1726776627.02679: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776627.02682: Calling groups_plugins_play to load vars for managed_node3 8283 1726776627.02730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776627.02768: done with get_vars() 8283 1726776627.02778: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74 Thursday 19 September 2024 16:10:27 -0400 (0:00:01.108) 0:00:10.733 **** 8283 1726776627.02866: entering _queue_task() for managed_node3/file 8283 1726776627.03059: worker is 1 (out of 1 available) 8283 1726776627.03074: exiting _queue_task() for managed_node3/file 8283 1726776627.03086: done queuing things up, now waiting for results queue to drain 8283 1726776627.03088: waiting for pending results... 
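
The loop result above (item=tuned) comes from the role task "Ensure required services are enabled and started", whose logged module arguments against ansible.legacy.systemd are name=tuned, state=started, enabled=true. A minimal sketch of a task that would produce this kind of per-item result is given below; the __kernel_settings_services list variable is an assumption for illustration, since the trace only shows the single item "tuned":

- name: Ensure required services are enabled and started
  ansible.builtin.systemd:
    name: "{{ item }}"        # logged module_args: name=tuned
    state: started            # logged module_args: state=started
    enabled: true             # logged module_args: enabled=true
  loop: "{{ __kernel_settings_services }}"   # assumed variable name; the log only shows item=tuned
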
8626 1726776627.03300: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8626 1726776627.03426: in run() - task 120fa90a-8a95-c4e4-06a7-00000000002f 8626 1726776627.03444: variable 'ansible_search_path' from source: unknown 8626 1726776627.03448: variable 'ansible_search_path' from source: unknown 8626 1726776627.03484: calling self._execute() 8626 1726776627.03547: variable 'ansible_host' from source: host vars for 'managed_node3' 8626 1726776627.03558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8626 1726776627.03567: variable 'omit' from source: magic vars 8626 1726776627.03667: variable 'omit' from source: magic vars 8626 1726776627.03717: variable 'omit' from source: magic vars 8626 1726776627.03747: variable '__kernel_settings_profile_dir' from source: role '' all vars 8626 1726776627.04016: variable '__kernel_settings_profile_dir' from source: role '' all vars 8626 1726776627.04114: variable '__kernel_settings_profile_parent' from source: set_fact 8626 1726776627.04122: variable '__kernel_settings_tuned_profile' from source: role '' all vars 8626 1726776627.04218: variable 'omit' from source: magic vars 8626 1726776627.04261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8626 1726776627.04297: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8626 1726776627.04318: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8626 1726776627.04336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8626 1726776627.04350: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8626 1726776627.04380: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8626 1726776627.04387: variable 'ansible_host' from source: host vars for 'managed_node3' 8626 1726776627.04391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8626 1726776627.04489: Set connection var ansible_module_compression to ZIP_DEFLATED 8626 1726776627.04497: Set connection var ansible_shell_type to sh 8626 1726776627.04504: Set connection var ansible_timeout to 10 8626 1726776627.04509: Set connection var ansible_connection to ssh 8626 1726776627.04517: Set connection var ansible_pipelining to False 8626 1726776627.04522: Set connection var ansible_shell_executable to /bin/sh 8626 1726776627.04543: variable 'ansible_shell_executable' from source: unknown 8626 1726776627.04548: variable 'ansible_connection' from source: unknown 8626 1726776627.04551: variable 'ansible_module_compression' from source: unknown 8626 1726776627.04555: variable 'ansible_shell_type' from source: unknown 8626 1726776627.04558: variable 'ansible_shell_executable' from source: unknown 8626 1726776627.04560: variable 'ansible_host' from source: host vars for 'managed_node3' 8626 1726776627.04564: variable 'ansible_pipelining' from source: unknown 8626 1726776627.04567: variable 'ansible_timeout' from source: unknown 8626 1726776627.04571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8626 1726776627.04756: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 8626 1726776627.04766: variable 'omit' from source: magic vars 8626 1726776627.04775: starting attempt loop 8626 1726776627.04778: running the handler 8626 1726776627.04790: _low_level_execute_command(): starting 8626 1726776627.04797: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8626 1726776627.07363: stdout chunk (state=2): >>>/root <<< 8626 1726776627.07583: stderr chunk (state=3): >>><<< 8626 1726776627.07590: stdout chunk (state=3): >>><<< 8626 1726776627.07612: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8626 1726776627.07626: _low_level_execute_command(): starting 8626 1726776627.07635: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776627.076199-8626-190666013094479 `" && echo ansible-tmp-1726776627.076199-8626-190666013094479="` echo /root/.ansible/tmp/ansible-tmp-1726776627.076199-8626-190666013094479 `" ) && sleep 0' 8626 1726776627.10308: stdout chunk (state=2): >>>ansible-tmp-1726776627.076199-8626-190666013094479=/root/.ansible/tmp/ansible-tmp-1726776627.076199-8626-190666013094479 <<< 8626 1726776627.10432: stderr chunk (state=3): >>><<< 8626 1726776627.10438: stdout chunk (state=3): >>><<< 8626 1726776627.10450: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776627.076199-8626-190666013094479=/root/.ansible/tmp/ansible-tmp-1726776627.076199-8626-190666013094479 , stderr= 8626 1726776627.10485: variable 'ansible_module_compression' from source: unknown 8626 1726776627.10525: ANSIBALLZ: Using lock for file 8626 1726776627.10532: ANSIBALLZ: Acquiring lock 8626 1726776627.10536: ANSIBALLZ: Lock acquired: 140690877998400 8626 1726776627.10541: ANSIBALLZ: Creating module 8626 1726776627.22406: ANSIBALLZ: Writing module into payload 8626 1726776627.22557: ANSIBALLZ: Writing module 8626 1726776627.22576: ANSIBALLZ: Renaming module 8626 1726776627.22584: ANSIBALLZ: Done creating module 8626 1726776627.22599: variable 'ansible_facts' from source: unknown 8626 1726776627.22657: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776627.076199-8626-190666013094479/AnsiballZ_file.py 8626 1726776627.22756: Sending initial data 8626 1726776627.22763: Sent initial data (150 bytes) 8626 1726776627.25271: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmp8yymq1o0 /root/.ansible/tmp/ansible-tmp-1726776627.076199-8626-190666013094479/AnsiballZ_file.py <<< 8626 1726776627.26287: stderr chunk (state=3): >>><<< 8626 1726776627.26294: stdout chunk (state=3): >>><<< 8626 1726776627.26312: done transferring module to remote 8626 1726776627.26322: _low_level_execute_command(): starting 8626 1726776627.26328: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776627.076199-8626-190666013094479/ /root/.ansible/tmp/ansible-tmp-1726776627.076199-8626-190666013094479/AnsiballZ_file.py && sleep 0' 8626 1726776627.28649: stderr chunk (state=2): >>><<< 8626 1726776627.28656: stdout chunk (state=2): >>><<< 8626 1726776627.28671: _low_level_execute_command() done: rc=0, stdout=, stderr= 8626 1726776627.28676: _low_level_execute_command(): starting 8626 1726776627.28681: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python 
/root/.ansible/tmp/ansible-tmp-1726776627.076199-8626-190666013094479/AnsiballZ_file.py && sleep 0' 8626 1726776627.44510: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "absent"}, "after": {"path": "/etc/tuned/kernel_settings", "state": "directory"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 6, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8626 1726776627.45548: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 8626 1726776627.45599: stderr chunk (state=3): >>><<< 8626 1726776627.45606: stdout chunk (state=3): >>><<< 8626 1726776627.45622: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "absent"}, "after": {"path": "/etc/tuned/kernel_settings", "state": "directory"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 6, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.8.186 closed. 
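
The file-module output above shows /etc/tuned/kernel_settings being created as a directory with mode 0755. A minimal sketch of the task behind this invocation, assuming the path is supplied through the __kernel_settings_profile_dir variable seen earlier in the trace (the role's exact expression is not visible in this log):

- name: Ensure kernel settings profile directory exists
  ansible.builtin.file:
    path: "{{ __kernel_settings_profile_dir }}"   # resolves to /etc/tuned/kernel_settings in this run
    state: directory
    mode: "0755"
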
8626 1726776627.45657: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776627.076199-8626-190666013094479/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8626 1726776627.45667: _low_level_execute_command(): starting 8626 1726776627.45672: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776627.076199-8626-190666013094479/ > /dev/null 2>&1 && sleep 0' 8626 1726776627.48034: stderr chunk (state=2): >>><<< 8626 1726776627.48041: stdout chunk (state=2): >>><<< 8626 1726776627.48054: _low_level_execute_command() done: rc=0, stdout=, stderr= 8626 1726776627.48061: handler run complete 8626 1726776627.48080: attempt loop complete, returning result 8626 1726776627.48084: _execute() done 8626 1726776627.48087: dumping result to json 8626 1726776627.48093: done dumping result, returning 8626 1726776627.48100: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists [120fa90a-8a95-c4e4-06a7-00000000002f] 8626 1726776627.48105: sending task result for task 120fa90a-8a95-c4e4-06a7-00000000002f 8626 1726776627.48139: done sending task result for task 120fa90a-8a95-c4e4-06a7-00000000002f 8626 1726776627.48142: WORKER PROCESS EXITING changed: [managed_node3] => { "changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/tuned/kernel_settings", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 6, "state": "directory", "uid": 0 } 8283 1726776627.48294: no more pending results, returning what we have 8283 1726776627.48297: results queue empty 8283 1726776627.48298: checking for any_errors_fatal 8283 1726776627.48309: done checking for any_errors_fatal 8283 1726776627.48309: checking for max_fail_percentage 8283 1726776627.48311: done checking for max_fail_percentage 8283 1726776627.48311: checking to see if all hosts have failed and the running result is not ok 8283 1726776627.48312: done checking to see if all hosts have failed 8283 1726776627.48312: getting the remaining hosts for this loop 8283 1726776627.48313: done getting the remaining hosts for this loop 8283 1726776627.48316: getting the next task for host managed_node3 8283 1726776627.48321: done getting next task for host managed_node3 8283 1726776627.48326: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8283 1726776627.48328: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 8283 1726776627.48338: getting variables 8283 1726776627.48339: in VariableManager get_vars() 8283 1726776627.48368: Calling all_inventory to load vars for managed_node3 8283 1726776627.48371: Calling groups_inventory to load vars for managed_node3 8283 1726776627.48376: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776627.48384: Calling all_plugins_play to load vars for managed_node3 8283 1726776627.48386: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776627.48388: Calling groups_plugins_play to load vars for managed_node3 8283 1726776627.48435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776627.48466: done with get_vars() 8283 1726776627.48475: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get active_profile] ********** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80 Thursday 19 September 2024 16:10:27 -0400 (0:00:00.456) 0:00:11.190 **** 8283 1726776627.48539: entering _queue_task() for managed_node3/slurp 8283 1726776627.48699: worker is 1 (out of 1 available) 8283 1726776627.48712: exiting _queue_task() for managed_node3/slurp 8283 1726776627.48723: done queuing things up, now waiting for results queue to drain 8283 1726776627.48725: waiting for pending results... 8647 1726776627.48832: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8647 1726776627.48925: in run() - task 120fa90a-8a95-c4e4-06a7-000000000030 8647 1726776627.48943: variable 'ansible_search_path' from source: unknown 8647 1726776627.48947: variable 'ansible_search_path' from source: unknown 8647 1726776627.48975: calling self._execute() 8647 1726776627.49021: variable 'ansible_host' from source: host vars for 'managed_node3' 8647 1726776627.49031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8647 1726776627.49040: variable 'omit' from source: magic vars 8647 1726776627.49111: variable 'omit' from source: magic vars 8647 1726776627.49146: variable 'omit' from source: magic vars 8647 1726776627.49169: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 8647 1726776627.49370: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 8647 1726776627.49426: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8647 1726776627.49453: variable 'omit' from source: magic vars 8647 1726776627.49483: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8647 1726776627.49509: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8647 1726776627.49523: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8647 1726776627.49539: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8647 1726776627.49552: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8647 1726776627.49575: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8647 1726776627.49580: variable 'ansible_host' from source: host vars for 'managed_node3' 8647 1726776627.49584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8647 
1726776627.49650: Set connection var ansible_module_compression to ZIP_DEFLATED 8647 1726776627.49658: Set connection var ansible_shell_type to sh 8647 1726776627.49664: Set connection var ansible_timeout to 10 8647 1726776627.49669: Set connection var ansible_connection to ssh 8647 1726776627.49676: Set connection var ansible_pipelining to False 8647 1726776627.49682: Set connection var ansible_shell_executable to /bin/sh 8647 1726776627.49697: variable 'ansible_shell_executable' from source: unknown 8647 1726776627.49700: variable 'ansible_connection' from source: unknown 8647 1726776627.49704: variable 'ansible_module_compression' from source: unknown 8647 1726776627.49707: variable 'ansible_shell_type' from source: unknown 8647 1726776627.49710: variable 'ansible_shell_executable' from source: unknown 8647 1726776627.49713: variable 'ansible_host' from source: host vars for 'managed_node3' 8647 1726776627.49716: variable 'ansible_pipelining' from source: unknown 8647 1726776627.49718: variable 'ansible_timeout' from source: unknown 8647 1726776627.49721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8647 1726776627.49859: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 8647 1726776627.49869: variable 'omit' from source: magic vars 8647 1726776627.49876: starting attempt loop 8647 1726776627.49880: running the handler 8647 1726776627.49890: _low_level_execute_command(): starting 8647 1726776627.49896: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8647 1726776627.52092: stdout chunk (state=2): >>>/root <<< 8647 1726776627.52197: stderr chunk (state=3): >>><<< 8647 1726776627.52203: stdout chunk (state=3): >>><<< 8647 1726776627.52220: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8647 1726776627.52233: _low_level_execute_command(): starting 8647 1726776627.52239: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776627.5222738-8647-273219428071584 `" && echo ansible-tmp-1726776627.5222738-8647-273219428071584="` echo /root/.ansible/tmp/ansible-tmp-1726776627.5222738-8647-273219428071584 `" ) && sleep 0' 8647 1726776627.54588: stdout chunk (state=2): >>>ansible-tmp-1726776627.5222738-8647-273219428071584=/root/.ansible/tmp/ansible-tmp-1726776627.5222738-8647-273219428071584 <<< 8647 1726776627.54711: stderr chunk (state=3): >>><<< 8647 1726776627.54719: stdout chunk (state=3): >>><<< 8647 1726776627.54733: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776627.5222738-8647-273219428071584=/root/.ansible/tmp/ansible-tmp-1726776627.5222738-8647-273219428071584 , stderr= 8647 1726776627.54764: variable 'ansible_module_compression' from source: unknown 8647 1726776627.54796: ANSIBALLZ: Using lock for slurp 8647 1726776627.54801: ANSIBALLZ: Acquiring lock 8647 1726776627.54805: ANSIBALLZ: Lock acquired: 140690877501456 8647 1726776627.54808: ANSIBALLZ: Creating module 8647 1726776627.63212: ANSIBALLZ: Writing module into payload 8647 1726776627.63265: ANSIBALLZ: Writing module 8647 1726776627.63285: ANSIBALLZ: Renaming module 8647 1726776627.63290: ANSIBALLZ: Done creating module 8647 1726776627.63302: variable 'ansible_facts' from source: unknown 8647 1726776627.63365: transferring 
module to remote /root/.ansible/tmp/ansible-tmp-1726776627.5222738-8647-273219428071584/AnsiballZ_slurp.py 8647 1726776627.63465: Sending initial data 8647 1726776627.63472: Sent initial data (152 bytes) 8647 1726776627.65966: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpvn86dbrj /root/.ansible/tmp/ansible-tmp-1726776627.5222738-8647-273219428071584/AnsiballZ_slurp.py <<< 8647 1726776627.66914: stderr chunk (state=3): >>><<< 8647 1726776627.66921: stdout chunk (state=3): >>><<< 8647 1726776627.66940: done transferring module to remote 8647 1726776627.66951: _low_level_execute_command(): starting 8647 1726776627.66957: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776627.5222738-8647-273219428071584/ /root/.ansible/tmp/ansible-tmp-1726776627.5222738-8647-273219428071584/AnsiballZ_slurp.py && sleep 0' 8647 1726776627.69205: stderr chunk (state=2): >>><<< 8647 1726776627.69212: stdout chunk (state=2): >>><<< 8647 1726776627.69225: _low_level_execute_command() done: rc=0, stdout=, stderr= 8647 1726776627.69230: _low_level_execute_command(): starting 8647 1726776627.69235: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776627.5222738-8647-273219428071584/AnsiballZ_slurp.py && sleep 0' 8647 1726776627.83608: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdAo=", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 8647 1726776627.84581: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 8647 1726776627.84647: stderr chunk (state=3): >>><<< 8647 1726776627.84654: stdout chunk (state=3): >>><<< 8647 1726776627.84669: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdAo=", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.8.186 closed. 
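
The slurp result above returns the current tuned profile as base64: "dmlydHVhbC1ndWVzdAo=" decodes to "virtual-guest", i.e. the profile that was active before the role made any changes. A minimal sketch of the "Get active_profile" task, with an assumed register name inferred from the later set_fact trace (the role's actual register is not shown here):

- name: Get active_profile
  ansible.builtin.slurp:
    path: /etc/tuned/active_profile   # supplied via __kernel_settings_tuned_active_profile in this run
  register: __kernel_settings_tuned_current_profile   # assumed name

The registered content would then be decoded with an expression along the lines of "{{ __kernel_settings_tuned_current_profile.content | b64decode | trim }}".
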
8647 1726776627.84706: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776627.5222738-8647-273219428071584/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8647 1726776627.84721: _low_level_execute_command(): starting 8647 1726776627.84727: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776627.5222738-8647-273219428071584/ > /dev/null 2>&1 && sleep 0' 8647 1726776627.87230: stderr chunk (state=2): >>><<< 8647 1726776627.87239: stdout chunk (state=2): >>><<< 8647 1726776627.87253: _low_level_execute_command() done: rc=0, stdout=, stderr= 8647 1726776627.87259: handler run complete 8647 1726776627.87272: attempt loop complete, returning result 8647 1726776627.87277: _execute() done 8647 1726776627.87280: dumping result to json 8647 1726776627.87284: done dumping result, returning 8647 1726776627.87291: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile [120fa90a-8a95-c4e4-06a7-000000000030] 8647 1726776627.87296: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000030 8647 1726776627.87325: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000030 8647 1726776627.87330: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "content": "dmlydHVhbC1ndWVzdAo=", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8283 1726776627.87477: no more pending results, returning what we have 8283 1726776627.87479: results queue empty 8283 1726776627.87480: checking for any_errors_fatal 8283 1726776627.87487: done checking for any_errors_fatal 8283 1726776627.87488: checking for max_fail_percentage 8283 1726776627.87489: done checking for max_fail_percentage 8283 1726776627.87489: checking to see if all hosts have failed and the running result is not ok 8283 1726776627.87490: done checking to see if all hosts have failed 8283 1726776627.87490: getting the remaining hosts for this loop 8283 1726776627.87492: done getting the remaining hosts for this loop 8283 1726776627.87496: getting the next task for host managed_node3 8283 1726776627.87502: done getting next task for host managed_node3 8283 1726776627.87509: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8283 1726776627.87511: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8283 1726776627.87521: getting variables 8283 1726776627.87522: in VariableManager get_vars() 8283 1726776627.87553: Calling all_inventory to load vars for managed_node3 8283 1726776627.87556: Calling groups_inventory to load vars for managed_node3 8283 1726776627.87558: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776627.87566: Calling all_plugins_play to load vars for managed_node3 8283 1726776627.87568: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776627.87570: Calling groups_plugins_play to load vars for managed_node3 8283 1726776627.87618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776627.87662: done with get_vars() 8283 1726776627.87670: done getting variables 8283 1726776627.87726: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set active_profile] ********** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:85 Thursday 19 September 2024 16:10:27 -0400 (0:00:00.392) 0:00:11.582 **** 8283 1726776627.87758: entering _queue_task() for managed_node3/set_fact 8283 1726776627.87951: worker is 1 (out of 1 available) 8283 1726776627.87963: exiting _queue_task() for managed_node3/set_fact 8283 1726776627.87979: done queuing things up, now waiting for results queue to drain 8283 1726776627.87980: waiting for pending results... 8661 1726776627.88187: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8661 1726776627.88305: in run() - task 120fa90a-8a95-c4e4-06a7-000000000031 8661 1726776627.88323: variable 'ansible_search_path' from source: unknown 8661 1726776627.88330: variable 'ansible_search_path' from source: unknown 8661 1726776627.88363: calling self._execute() 8661 1726776627.88427: variable 'ansible_host' from source: host vars for 'managed_node3' 8661 1726776627.88440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8661 1726776627.88448: variable 'omit' from source: magic vars 8661 1726776627.88545: variable 'omit' from source: magic vars 8661 1726776627.88589: variable 'omit' from source: magic vars 8661 1726776627.88963: variable '__kernel_settings_tuned_profile' from source: role '' all vars 8661 1726776627.88978: variable '__cur_profile' from source: task vars 8661 1726776627.89179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8661 1726776627.91538: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8661 1726776627.91603: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8661 1726776627.91639: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8661 1726776627.91671: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8661 1726776627.91699: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8661 1726776627.91770: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8661 1726776627.91801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8661 1726776627.91825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8661 1726776627.91866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8661 1726776627.91885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8661 1726776627.91991: variable '__kernel_settings_tuned_current_profile' from source: set_fact 8661 1726776627.92036: variable 'omit' from source: magic vars 8661 1726776627.92061: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8661 1726776627.92088: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8661 1726776627.92104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8661 1726776627.92120: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8661 1726776627.92133: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8661 1726776627.92159: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8661 1726776627.92165: variable 'ansible_host' from source: host vars for 'managed_node3' 8661 1726776627.92169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8661 1726776627.92264: Set connection var ansible_module_compression to ZIP_DEFLATED 8661 1726776627.92275: Set connection var ansible_shell_type to sh 8661 1726776627.92282: Set connection var ansible_timeout to 10 8661 1726776627.92287: Set connection var ansible_connection to ssh 8661 1726776627.92295: Set connection var ansible_pipelining to False 8661 1726776627.92301: Set connection var ansible_shell_executable to /bin/sh 8661 1726776627.92320: variable 'ansible_shell_executable' from source: unknown 8661 1726776627.92325: variable 'ansible_connection' from source: unknown 8661 1726776627.92328: variable 'ansible_module_compression' from source: unknown 8661 1726776627.92389: variable 'ansible_shell_type' from source: unknown 8661 1726776627.92393: variable 'ansible_shell_executable' from source: unknown 8661 1726776627.92397: variable 'ansible_host' from source: host vars for 'managed_node3' 8661 1726776627.92401: variable 'ansible_pipelining' from source: unknown 8661 1726776627.92403: variable 'ansible_timeout' from source: unknown 8661 1726776627.92407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8661 1726776627.92495: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8661 1726776627.92506: variable 'omit' from source: magic vars 8661 1726776627.92512: starting attempt loop 8661 1726776627.92515: running the handler 8661 1726776627.92524: handler run complete 8661 1726776627.92534: attempt loop complete, returning result 8661 1726776627.92538: _execute() done 8661 1726776627.92541: dumping result to json 8661 1726776627.92544: done dumping result, returning 8661 1726776627.92550: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile [120fa90a-8a95-c4e4-06a7-000000000031] 8661 1726776627.92555: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000031 8661 1726776627.92582: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000031 8661 1726776627.92585: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_active_profile": "virtual-guest kernel_settings" }, "changed": false } 8283 1726776627.92969: no more pending results, returning what we have 8283 1726776627.92971: results queue empty 8283 1726776627.92972: checking for any_errors_fatal 8283 1726776627.92979: done checking for any_errors_fatal 8283 1726776627.92980: checking for max_fail_percentage 8283 1726776627.92981: done checking for max_fail_percentage 8283 1726776627.92981: checking to see if all hosts have failed and the running result is not ok 8283 1726776627.92982: done checking to see if all hosts have failed 8283 1726776627.92982: getting the remaining hosts for this loop 8283 1726776627.92983: done getting the remaining hosts for this loop 8283 1726776627.92986: getting the next task for host managed_node3 8283 1726776627.92992: done getting next task for host managed_node3 8283 1726776627.92994: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8283 1726776627.92997: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8283 1726776627.93010: getting variables 8283 1726776627.93011: in VariableManager get_vars() 8283 1726776627.93042: Calling all_inventory to load vars for managed_node3 8283 1726776627.93044: Calling groups_inventory to load vars for managed_node3 8283 1726776627.93046: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776627.93054: Calling all_plugins_play to load vars for managed_node3 8283 1726776627.93057: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776627.93060: Calling groups_plugins_play to load vars for managed_node3 8283 1726776627.93111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776627.93153: done with get_vars() 8283 1726776627.93161: done getting variables 8283 1726776627.93282: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91 Thursday 19 September 2024 16:10:27 -0400 (0:00:00.055) 0:00:11.637 **** 8283 1726776627.93313: entering _queue_task() for managed_node3/copy 8283 1726776627.93509: worker is 1 (out of 1 available) 8283 1726776627.93521: exiting _queue_task() for managed_node3/copy 8283 1726776627.93535: done queuing things up, now waiting for results queue to drain 8283 1726776627.93537: waiting for pending results... 8662 1726776627.93732: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8662 1726776627.93858: in run() - task 120fa90a-8a95-c4e4-06a7-000000000032 8662 1726776627.93878: variable 'ansible_search_path' from source: unknown 8662 1726776627.93884: variable 'ansible_search_path' from source: unknown 8662 1726776627.93914: calling self._execute() 8662 1726776627.93976: variable 'ansible_host' from source: host vars for 'managed_node3' 8662 1726776627.93986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8662 1726776627.93995: variable 'omit' from source: magic vars 8662 1726776627.94093: variable 'omit' from source: magic vars 8662 1726776627.94138: variable 'omit' from source: magic vars 8662 1726776627.94166: variable '__kernel_settings_active_profile' from source: set_fact 8662 1726776627.94440: variable '__kernel_settings_active_profile' from source: set_fact 8662 1726776627.94466: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 8662 1726776627.94539: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 8662 1726776627.94612: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8662 1726776627.94696: variable 'omit' from source: magic vars 8662 1726776627.94743: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8662 1726776627.94780: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8662 1726776627.94800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8662 1726776627.94888: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8662 1726776627.94902: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8662 1726776627.94932: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8662 1726776627.94939: variable 'ansible_host' from source: host vars for 'managed_node3' 8662 1726776627.94943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8662 1726776627.95045: Set connection var ansible_module_compression to ZIP_DEFLATED 8662 1726776627.95054: Set connection var ansible_shell_type to sh 8662 1726776627.95060: Set connection var ansible_timeout to 10 8662 1726776627.95066: Set connection var ansible_connection to ssh 8662 1726776627.95076: Set connection var ansible_pipelining to False 8662 1726776627.95083: Set connection var ansible_shell_executable to /bin/sh 8662 1726776627.95103: variable 'ansible_shell_executable' from source: unknown 8662 1726776627.95107: variable 'ansible_connection' from source: unknown 8662 1726776627.95111: variable 'ansible_module_compression' from source: unknown 8662 1726776627.95114: variable 'ansible_shell_type' from source: unknown 8662 1726776627.95117: variable 'ansible_shell_executable' from source: unknown 8662 1726776627.95121: variable 'ansible_host' from source: host vars for 'managed_node3' 8662 1726776627.95125: variable 'ansible_pipelining' from source: unknown 8662 1726776627.95128: variable 'ansible_timeout' from source: unknown 8662 1726776627.95134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8662 1726776627.95250: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8662 1726776627.95261: variable 'omit' from source: magic vars 8662 1726776627.95267: starting attempt loop 8662 1726776627.95270: running the handler 8662 1726776627.95282: _low_level_execute_command(): starting 8662 1726776627.95290: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8662 1726776627.98536: stdout chunk (state=2): >>>/root <<< 8662 1726776627.98973: stderr chunk (state=3): >>><<< 8662 1726776627.98981: stdout chunk (state=3): >>><<< 8662 1726776627.99001: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8662 1726776627.99016: _low_level_execute_command(): starting 8662 1726776627.99022: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960 `" && echo ansible-tmp-1726776627.9900846-8662-82564124147960="` echo /root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960 `" ) && sleep 0' 8662 1726776628.01902: stdout chunk (state=2): >>>ansible-tmp-1726776627.9900846-8662-82564124147960=/root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960 <<< 8662 1726776628.02034: stderr chunk (state=3): >>><<< 8662 1726776628.02041: stdout chunk (state=3): >>><<< 8662 1726776628.02057: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776627.9900846-8662-82564124147960=/root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960 , stderr= 8662 1726776628.02126: variable 
'ansible_module_compression' from source: unknown 8662 1726776628.02168: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8662 1726776628.02198: variable 'ansible_facts' from source: unknown 8662 1726776628.02267: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960/AnsiballZ_stat.py 8662 1726776628.02352: Sending initial data 8662 1726776628.02359: Sent initial data (150 bytes) 8662 1726776628.05072: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpfslq9wfw /root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960/AnsiballZ_stat.py <<< 8662 1726776628.06039: stderr chunk (state=3): >>><<< 8662 1726776628.06046: stdout chunk (state=3): >>><<< 8662 1726776628.06065: done transferring module to remote 8662 1726776628.06075: _low_level_execute_command(): starting 8662 1726776628.06081: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960/ /root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960/AnsiballZ_stat.py && sleep 0' 8662 1726776628.08572: stderr chunk (state=2): >>><<< 8662 1726776628.08582: stdout chunk (state=2): >>><<< 8662 1726776628.08597: _low_level_execute_command() done: rc=0, stdout=, stderr= 8662 1726776628.08602: _low_level_execute_command(): starting 8662 1726776628.08608: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960/AnsiballZ_stat.py && sleep 0' 8662 1726776628.24317: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 14, "inode": 918059, "dev": 51713, "nlink": 1, "atime": 1726776627.8348444, "mtime": 1726776414.9125676, "ctime": 1726776414.9125676, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "mimetype": "text/plain", "charset": "us-ascii", "version": "3521309722", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 8662 1726776628.25409: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. 
<<< 8662 1726776628.25461: stderr chunk (state=3): >>><<< 8662 1726776628.25469: stdout chunk (state=3): >>><<< 8662 1726776628.25486: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 14, "inode": 918059, "dev": 51713, "nlink": 1, "atime": 1726776627.8348444, "mtime": 1726776414.9125676, "ctime": 1726776414.9125676, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "mimetype": "text/plain", "charset": "us-ascii", "version": "3521309722", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.186 closed. 8662 1726776628.25553: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8662 1726776628.25978: Sending initial data 8662 1726776628.25986: Sent initial data (139 bytes) 8662 1726776628.28288: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpqdvxsljg /root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960/source <<< 8662 1726776628.28920: stderr chunk (state=3): >>><<< 8662 1726776628.28927: stdout chunk (state=3): >>><<< 8662 1726776628.28950: _low_level_execute_command(): starting 8662 1726776628.28957: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960/ /root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960/source && sleep 0' 8662 1726776628.31446: stderr chunk (state=2): >>><<< 8662 1726776628.31455: stdout chunk (state=2): >>><<< 8662 1726776628.31471: _low_level_execute_command() done: rc=0, stdout=, stderr= 8662 1726776628.31499: variable 'ansible_module_compression' from source: unknown 8662 1726776628.31550: ANSIBALLZ: Using generic lock for ansible.legacy.copy 8662 1726776628.31555: ANSIBALLZ: Acquiring lock 8662 1726776628.31558: ANSIBALLZ: Lock acquired: 140690877500448 8662 1726776628.31561: ANSIBALLZ: Creating module 8662 1726776628.47488: ANSIBALLZ: Writing module into payload 8662 1726776628.47681: ANSIBALLZ: Writing module 8662 1726776628.47704: ANSIBALLZ: Renaming module 8662 1726776628.47711: ANSIBALLZ: Done creating module 8662 1726776628.47724: variable 'ansible_facts' from source: 
unknown 8662 1726776628.47800: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960/AnsiballZ_copy.py 8662 1726776628.48535: Sending initial data 8662 1726776628.48542: Sent initial data (150 bytes) 8662 1726776628.51630: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmp95qbyzrb /root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960/AnsiballZ_copy.py <<< 8662 1726776628.53011: stderr chunk (state=3): >>><<< 8662 1726776628.53025: stdout chunk (state=3): >>><<< 8662 1726776628.53049: done transferring module to remote 8662 1726776628.53060: _low_level_execute_command(): starting 8662 1726776628.53065: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960/ /root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960/AnsiballZ_copy.py && sleep 0' 8662 1726776628.56385: stderr chunk (state=2): >>><<< 8662 1726776628.56396: stdout chunk (state=2): >>><<< 8662 1726776628.56414: _low_level_execute_command() done: rc=0, stdout=, stderr= 8662 1726776628.56419: _low_level_execute_command(): starting 8662 1726776628.56425: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960/AnsiballZ_copy.py && sleep 0' 8662 1726776628.73463: stdout chunk (state=2): >>> {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960/source", "md5sum": "8d80fe3f09ba4b9ac8d7fd5e8541a324", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960/source", "_original_basename": "tmpqdvxsljg", "follow": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8662 1726776628.74914: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. 
<<< 8662 1726776628.74956: stderr chunk (state=3): >>><<< 8662 1726776628.74963: stdout chunk (state=3): >>><<< 8662 1726776628.74985: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960/source", "md5sum": "8d80fe3f09ba4b9ac8d7fd5e8541a324", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960/source", "_original_basename": "tmpqdvxsljg", "follow": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.8.186 closed. 8662 1726776628.75023: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/active_profile', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960/source', '_original_basename': 'tmpqdvxsljg', 'follow': False, 'checksum': 'a79569d3860cb6a066e0e92c8b22ffd0e8796bfd', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8662 1726776628.75039: _low_level_execute_command(): starting 8662 1726776628.75046: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960/ > /dev/null 2>&1 && sleep 0' 8662 1726776628.77643: stderr chunk (state=2): >>><<< 8662 1726776628.77651: stdout chunk (state=2): >>><<< 8662 1726776628.77666: _low_level_execute_command() done: rc=0, stdout=, stderr= 8662 1726776628.77673: handler run complete 8662 1726776628.77699: attempt loop complete, returning result 8662 1726776628.77705: _execute() done 8662 1726776628.77708: dumping result to json 8662 1726776628.77714: done dumping result, returning 8662 1726776628.77721: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile [120fa90a-8a95-c4e4-06a7-000000000032] 8662 1726776628.77726: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000032 8662 1726776628.77770: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000032 8662 1726776628.77774: WORKER PROCESS EXITING changed: [managed_node3] => { "changed": true, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "md5sum": "8d80fe3f09ba4b9ac8d7fd5e8541a324", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "src": 
"/root/.ansible/tmp/ansible-tmp-1726776627.9900846-8662-82564124147960/source", "state": "file", "uid": 0 } 8283 1726776628.78260: no more pending results, returning what we have 8283 1726776628.78263: results queue empty 8283 1726776628.78263: checking for any_errors_fatal 8283 1726776628.78268: done checking for any_errors_fatal 8283 1726776628.78269: checking for max_fail_percentage 8283 1726776628.78270: done checking for max_fail_percentage 8283 1726776628.78271: checking to see if all hosts have failed and the running result is not ok 8283 1726776628.78272: done checking to see if all hosts have failed 8283 1726776628.78272: getting the remaining hosts for this loop 8283 1726776628.78273: done getting the remaining hosts for this loop 8283 1726776628.78276: getting the next task for host managed_node3 8283 1726776628.78283: done getting next task for host managed_node3 8283 1726776628.78287: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8283 1726776628.78289: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776628.78298: getting variables 8283 1726776628.78300: in VariableManager get_vars() 8283 1726776628.78333: Calling all_inventory to load vars for managed_node3 8283 1726776628.78336: Calling groups_inventory to load vars for managed_node3 8283 1726776628.78338: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776628.78347: Calling all_plugins_play to load vars for managed_node3 8283 1726776628.78349: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776628.78352: Calling groups_plugins_play to load vars for managed_node3 8283 1726776628.78402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776628.78444: done with get_vars() 8283 1726776628.78454: done getting variables 8283 1726776628.78509: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set profile_mode to manual] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99 Thursday 19 September 2024 16:10:28 -0400 (0:00:00.852) 0:00:12.490 **** 8283 1726776628.78544: entering _queue_task() for managed_node3/copy 8283 1726776628.78730: worker is 1 (out of 1 available) 8283 1726776628.78743: exiting _queue_task() for managed_node3/copy 8283 1726776628.78754: done queuing things up, now waiting for results queue to drain 8283 1726776628.78755: waiting for pending results... 
8714 1726776628.78963: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8714 1726776628.79069: in run() - task 120fa90a-8a95-c4e4-06a7-000000000033 8714 1726776628.79084: variable 'ansible_search_path' from source: unknown 8714 1726776628.79088: variable 'ansible_search_path' from source: unknown 8714 1726776628.79114: calling self._execute() 8714 1726776628.79163: variable 'ansible_host' from source: host vars for 'managed_node3' 8714 1726776628.79172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8714 1726776628.79181: variable 'omit' from source: magic vars 8714 1726776628.79253: variable 'omit' from source: magic vars 8714 1726776628.79287: variable 'omit' from source: magic vars 8714 1726776628.79308: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 8714 1726776628.79569: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 8714 1726776628.79627: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8714 1726776628.79656: variable 'omit' from source: magic vars 8714 1726776628.79688: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8714 1726776628.79716: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8714 1726776628.79735: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8714 1726776628.79749: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8714 1726776628.79761: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8714 1726776628.79785: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8714 1726776628.79791: variable 'ansible_host' from source: host vars for 'managed_node3' 8714 1726776628.79795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8714 1726776628.79864: Set connection var ansible_module_compression to ZIP_DEFLATED 8714 1726776628.79871: Set connection var ansible_shell_type to sh 8714 1726776628.79878: Set connection var ansible_timeout to 10 8714 1726776628.79884: Set connection var ansible_connection to ssh 8714 1726776628.79891: Set connection var ansible_pipelining to False 8714 1726776628.79896: Set connection var ansible_shell_executable to /bin/sh 8714 1726776628.79910: variable 'ansible_shell_executable' from source: unknown 8714 1726776628.79913: variable 'ansible_connection' from source: unknown 8714 1726776628.79916: variable 'ansible_module_compression' from source: unknown 8714 1726776628.79919: variable 'ansible_shell_type' from source: unknown 8714 1726776628.79923: variable 'ansible_shell_executable' from source: unknown 8714 1726776628.79925: variable 'ansible_host' from source: host vars for 'managed_node3' 8714 1726776628.79929: variable 'ansible_pipelining' from source: unknown 8714 1726776628.79932: variable 'ansible_timeout' from source: unknown 8714 1726776628.79934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8714 1726776628.80013: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 8714 1726776628.80021: variable 'omit' from source: magic vars 8714 1726776628.80025: starting attempt loop 8714 1726776628.80027: running the handler 8714 1726776628.80037: _low_level_execute_command(): starting 8714 1726776628.80043: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8714 1726776628.82326: stdout chunk (state=2): >>>/root <<< 8714 1726776628.82445: stderr chunk (state=3): >>><<< 8714 1726776628.82452: stdout chunk (state=3): >>><<< 8714 1726776628.82467: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8714 1726776628.82481: _low_level_execute_command(): starting 8714 1726776628.82487: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314 `" && echo ansible-tmp-1726776628.824764-8714-41995979303314="` echo /root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314 `" ) && sleep 0' 8714 1726776628.84889: stdout chunk (state=2): >>>ansible-tmp-1726776628.824764-8714-41995979303314=/root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314 <<< 8714 1726776628.85014: stderr chunk (state=3): >>><<< 8714 1726776628.85026: stdout chunk (state=3): >>><<< 8714 1726776628.85042: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776628.824764-8714-41995979303314=/root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314 , stderr= 8714 1726776628.85101: variable 'ansible_module_compression' from source: unknown 8714 1726776628.85151: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8714 1726776628.85185: variable 'ansible_facts' from source: unknown 8714 1726776628.85270: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314/AnsiballZ_stat.py 8714 1726776628.85739: Sending initial data 8714 1726776628.85746: Sent initial data (149 bytes) 8714 1726776628.88249: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmp73pnoi8t /root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314/AnsiballZ_stat.py <<< 8714 1726776628.89638: stderr chunk (state=3): >>><<< 8714 1726776628.89646: stdout chunk (state=3): >>><<< 8714 1726776628.89665: done transferring module to remote 8714 1726776628.89679: _low_level_execute_command(): starting 8714 1726776628.89685: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314/ /root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314/AnsiballZ_stat.py && sleep 0' 8714 1726776628.92692: stderr chunk (state=2): >>><<< 8714 1726776628.92701: stdout chunk (state=2): >>><<< 8714 1726776628.92715: _low_level_execute_command() done: rc=0, stdout=, stderr= 8714 1726776628.92719: _low_level_execute_command(): starting 8714 1726776628.92724: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314/AnsiballZ_stat.py && sleep 0' 8714 1726776629.08686: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 5, "inode": 917923, "dev": 51713, "nlink": 1, "atime": 
1726776414.5015676, "mtime": 1726776414.9125676, "ctime": 1726776414.9125676, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "mimetype": "text/plain", "charset": "us-ascii", "version": "3852760320", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 8714 1726776629.10101: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 8714 1726776629.10111: stdout chunk (state=3): >>><<< 8714 1726776629.10121: stderr chunk (state=3): >>><<< 8714 1726776629.10136: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 5, "inode": 917923, "dev": 51713, "nlink": 1, "atime": 1726776414.5015676, "mtime": 1726776414.9125676, "ctime": 1726776414.9125676, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "mimetype": "text/plain", "charset": "us-ascii", "version": "3852760320", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.186 closed. 
8714 1726776629.10203: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8714 1726776629.10605: Sending initial data 8714 1726776629.10612: Sent initial data (138 bytes) 8714 1726776629.13435: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmp0drunr37 /root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314/source <<< 8714 1726776629.13784: stderr chunk (state=3): >>><<< 8714 1726776629.13793: stdout chunk (state=3): >>><<< 8714 1726776629.13816: _low_level_execute_command(): starting 8714 1726776629.13823: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314/ /root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314/source && sleep 0' 8714 1726776629.16289: stderr chunk (state=2): >>><<< 8714 1726776629.16299: stdout chunk (state=2): >>><<< 8714 1726776629.16315: _low_level_execute_command() done: rc=0, stdout=, stderr= 8714 1726776629.16340: variable 'ansible_module_compression' from source: unknown 8714 1726776629.16382: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 8714 1726776629.16403: variable 'ansible_facts' from source: unknown 8714 1726776629.16470: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314/AnsiballZ_copy.py 8714 1726776629.16917: Sending initial data 8714 1726776629.16925: Sent initial data (149 bytes) 8714 1726776629.19331: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpfdtukfpx /root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314/AnsiballZ_copy.py <<< 8714 1726776629.20553: stderr chunk (state=3): >>><<< 8714 1726776629.20564: stdout chunk (state=3): >>><<< 8714 1726776629.20589: done transferring module to remote 8714 1726776629.20600: _low_level_execute_command(): starting 8714 1726776629.20605: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314/ /root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314/AnsiballZ_copy.py && sleep 0' 8714 1726776629.23515: stderr chunk (state=2): >>><<< 8714 1726776629.23525: stdout chunk (state=2): >>><<< 8714 1726776629.23543: _low_level_execute_command() done: rc=0, stdout=, stderr= 8714 1726776629.23549: _low_level_execute_command(): starting 8714 1726776629.23554: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314/AnsiballZ_copy.py && sleep 0' 8714 1726776629.40185: stdout chunk (state=2): >>> {"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314/source", "md5sum": 
"cf3f2a865fbea819dadd439586eaee31", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314/source", "_original_basename": "tmp0drunr37", "follow": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8714 1726776629.41410: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 8714 1726776629.41499: stderr chunk (state=3): >>><<< 8714 1726776629.41506: stdout chunk (state=3): >>><<< 8714 1726776629.41523: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314/source", "md5sum": "cf3f2a865fbea819dadd439586eaee31", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314/source", "_original_basename": "tmp0drunr37", "follow": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.8.186 closed. 
8714 1726776629.41559: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/profile_mode', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314/source', '_original_basename': 'tmp0drunr37', 'follow': False, 'checksum': '3ef9f23deed2e23d3ef2b88b842fb882313e15ce', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8714 1726776629.41571: _low_level_execute_command(): starting 8714 1726776629.41581: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314/ > /dev/null 2>&1 && sleep 0' 8714 1726776629.44146: stderr chunk (state=2): >>><<< 8714 1726776629.44154: stdout chunk (state=2): >>><<< 8714 1726776629.44167: _low_level_execute_command() done: rc=0, stdout=, stderr= 8714 1726776629.44174: handler run complete 8714 1726776629.44196: attempt loop complete, returning result 8714 1726776629.44200: _execute() done 8714 1726776629.44203: dumping result to json 8714 1726776629.44208: done dumping result, returning 8714 1726776629.44215: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual [120fa90a-8a95-c4e4-06a7-000000000033] 8714 1726776629.44223: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000033 8714 1726776629.44256: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000033 8714 1726776629.44260: WORKER PROCESS EXITING changed: [managed_node3] => { "changed": true, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "md5sum": "cf3f2a865fbea819dadd439586eaee31", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "src": "/root/.ansible/tmp/ansible-tmp-1726776628.824764-8714-41995979303314/source", "state": "file", "uid": 0 } 8283 1726776629.44408: no more pending results, returning what we have 8283 1726776629.44411: results queue empty 8283 1726776629.44411: checking for any_errors_fatal 8283 1726776629.44416: done checking for any_errors_fatal 8283 1726776629.44417: checking for max_fail_percentage 8283 1726776629.44418: done checking for max_fail_percentage 8283 1726776629.44418: checking to see if all hosts have failed and the running result is not ok 8283 1726776629.44419: done checking to see if all hosts have failed 8283 1726776629.44419: getting the remaining hosts for this loop 8283 1726776629.44420: done getting the remaining hosts for this loop 8283 1726776629.44423: getting the next task for host managed_node3 8283 1726776629.44430: done getting next task for host managed_node3 8283 1726776629.44433: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 8283 1726776629.44435: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776629.44445: getting variables 8283 1726776629.44446: in VariableManager get_vars() 8283 1726776629.44475: Calling all_inventory to load vars for managed_node3 8283 1726776629.44478: Calling groups_inventory to load vars for managed_node3 8283 1726776629.44480: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776629.44487: Calling all_plugins_play to load vars for managed_node3 8283 1726776629.44490: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776629.44492: Calling groups_plugins_play to load vars for managed_node3 8283 1726776629.44543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776629.44575: done with get_vars() 8283 1726776629.44581: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get current config] ********** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107 Thursday 19 September 2024 16:10:29 -0400 (0:00:00.660) 0:00:13.151 **** 8283 1726776629.44643: entering _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config 8283 1726776629.44836: worker is 1 (out of 1 available) 8283 1726776629.44850: exiting _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config 8283 1726776629.44860: done queuing things up, now waiting for results queue to drain 8283 1726776629.44862: waiting for pending results... 
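Every module invocation in this log follows the same remote lifecycle, visible verbatim in the shell commands above: create a private temporary directory under ~/.ansible/tmp with umask 77, sftp the AnsiballZ_*.py payload into it, chmod u+x, execute it with /usr/libexec/platform-python, and finally rm -f -r the directory. A small sketch of that lifecycle run locally instead of over SSH, with a trivial payload standing in for the AnsiballZ zip (file names here are illustrative only):

    import os
    import shutil
    import subprocess
    import tempfile

    def run_payload(payload_source):
        """Mimic the per-task lifecycle from the log on the local host:
        private temp dir -> drop payload -> chmod u+x -> execute -> clean up."""
        old_umask = os.umask(0o77)              # like the 'umask 77 && mkdir' above
        try:
            workdir = tempfile.mkdtemp(prefix="ansible-tmp-")
        finally:
            os.umask(old_umask)
        try:
            payload = os.path.join(workdir, "AnsiballZ_example.py")
            with open(payload, "w") as f:
                f.write(payload_source)         # in the log this is an sftp 'put'
            os.chmod(payload, os.stat(payload).st_mode | 0o100)   # chmod u+x
            result = subprocess.run(["python3", payload],
                                    capture_output=True, text=True, check=True)
            return result.stdout
        finally:
            shutil.rmtree(workdir)              # rm -f -r .../ > /dev/null 2>&1

    print(run_payload('import json; print(json.dumps({"changed": False}))'))
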
8765 1726776629.44994: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get current config 8765 1726776629.45115: in run() - task 120fa90a-8a95-c4e4-06a7-000000000034 8765 1726776629.45134: variable 'ansible_search_path' from source: unknown 8765 1726776629.45139: variable 'ansible_search_path' from source: unknown 8765 1726776629.45170: calling self._execute() 8765 1726776629.45231: variable 'ansible_host' from source: host vars for 'managed_node3' 8765 1726776629.45241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8765 1726776629.45248: variable 'omit' from source: magic vars 8765 1726776629.45337: variable 'omit' from source: magic vars 8765 1726776629.45383: variable 'omit' from source: magic vars 8765 1726776629.45407: variable '__kernel_settings_profile_filename' from source: role '' all vars 8765 1726776629.45656: variable '__kernel_settings_profile_filename' from source: role '' all vars 8765 1726776629.45736: variable '__kernel_settings_profile_dir' from source: role '' all vars 8765 1726776629.45819: variable '__kernel_settings_profile_parent' from source: set_fact 8765 1726776629.45828: variable '__kernel_settings_tuned_profile' from source: role '' all vars 8765 1726776629.45890: variable 'omit' from source: magic vars 8765 1726776629.45932: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8765 1726776629.45965: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8765 1726776629.45986: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8765 1726776629.46004: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8765 1726776629.46017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8765 1726776629.46045: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8765 1726776629.46050: variable 'ansible_host' from source: host vars for 'managed_node3' 8765 1726776629.46054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8765 1726776629.46150: Set connection var ansible_module_compression to ZIP_DEFLATED 8765 1726776629.46159: Set connection var ansible_shell_type to sh 8765 1726776629.46165: Set connection var ansible_timeout to 10 8765 1726776629.46170: Set connection var ansible_connection to ssh 8765 1726776629.46180: Set connection var ansible_pipelining to False 8765 1726776629.46185: Set connection var ansible_shell_executable to /bin/sh 8765 1726776629.46204: variable 'ansible_shell_executable' from source: unknown 8765 1726776629.46208: variable 'ansible_connection' from source: unknown 8765 1726776629.46210: variable 'ansible_module_compression' from source: unknown 8765 1726776629.46213: variable 'ansible_shell_type' from source: unknown 8765 1726776629.46216: variable 'ansible_shell_executable' from source: unknown 8765 1726776629.46218: variable 'ansible_host' from source: host vars for 'managed_node3' 8765 1726776629.46222: variable 'ansible_pipelining' from source: unknown 8765 1726776629.46224: variable 'ansible_timeout' from source: unknown 8765 1726776629.46228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8765 1726776629.46393: Loading ActionModule 'normal' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 8765 1726776629.46404: variable 'omit' from source: magic vars 8765 1726776629.46410: starting attempt loop 8765 1726776629.46413: running the handler 8765 1726776629.46425: _low_level_execute_command(): starting 8765 1726776629.46434: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8765 1726776629.49136: stdout chunk (state=2): >>>/root <<< 8765 1726776629.49270: stderr chunk (state=3): >>><<< 8765 1726776629.49278: stdout chunk (state=3): >>><<< 8765 1726776629.49301: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8765 1726776629.49315: _low_level_execute_command(): starting 8765 1726776629.49324: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776629.493091-8765-240382010042907 `" && echo ansible-tmp-1726776629.493091-8765-240382010042907="` echo /root/.ansible/tmp/ansible-tmp-1726776629.493091-8765-240382010042907 `" ) && sleep 0' 8765 1726776629.52816: stdout chunk (state=2): >>>ansible-tmp-1726776629.493091-8765-240382010042907=/root/.ansible/tmp/ansible-tmp-1726776629.493091-8765-240382010042907 <<< 8765 1726776629.52965: stderr chunk (state=3): >>><<< 8765 1726776629.52973: stdout chunk (state=3): >>><<< 8765 1726776629.52990: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776629.493091-8765-240382010042907=/root/.ansible/tmp/ansible-tmp-1726776629.493091-8765-240382010042907 , stderr= 8765 1726776629.53038: variable 'ansible_module_compression' from source: unknown 8765 1726776629.53078: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 8765 1726776629.53116: variable 'ansible_facts' from source: unknown 8765 1726776629.53208: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776629.493091-8765-240382010042907/AnsiballZ_kernel_settings_get_config.py 8765 1726776629.53668: Sending initial data 8765 1726776629.53675: Sent initial data (172 bytes) 8765 1726776629.57189: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpajrr7_6m /root/.ansible/tmp/ansible-tmp-1726776629.493091-8765-240382010042907/AnsiballZ_kernel_settings_get_config.py <<< 8765 1726776629.59011: stderr chunk (state=3): >>><<< 8765 1726776629.59022: stdout chunk (state=3): >>><<< 8765 1726776629.59050: done transferring module to remote 8765 1726776629.59063: _low_level_execute_command(): starting 8765 1726776629.59069: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776629.493091-8765-240382010042907/ /root/.ansible/tmp/ansible-tmp-1726776629.493091-8765-240382010042907/AnsiballZ_kernel_settings_get_config.py && sleep 0' 8765 1726776629.61912: stderr chunk (state=2): >>><<< 8765 1726776629.61921: stdout chunk (state=2): >>><<< 8765 1726776629.61938: _low_level_execute_command() done: rc=0, stdout=, stderr= 8765 1726776629.61944: _low_level_execute_command(): starting 8765 1726776629.61951: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776629.493091-8765-240382010042907/AnsiballZ_kernel_settings_get_config.py && sleep 
0' 8765 1726776629.77943: stdout chunk (state=2): >>> {"changed": false, "data": {}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} <<< 8765 1726776629.79024: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 8765 1726776629.79073: stderr chunk (state=3): >>><<< 8765 1726776629.79082: stdout chunk (state=3): >>><<< 8765 1726776629.79098: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} , stderr=Shared connection to 10.31.8.186 closed. 8765 1726776629.79117: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776629.493091-8765-240382010042907/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8765 1726776629.79127: _low_level_execute_command(): starting 8765 1726776629.79134: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776629.493091-8765-240382010042907/ > /dev/null 2>&1 && sleep 0' 8765 1726776629.81525: stderr chunk (state=2): >>><<< 8765 1726776629.81534: stdout chunk (state=2): >>><<< 8765 1726776629.81547: _low_level_execute_command() done: rc=0, stdout=, stderr= 8765 1726776629.81554: handler run complete 8765 1726776629.81567: attempt loop complete, returning result 8765 1726776629.81571: _execute() done 8765 1726776629.81575: dumping result to json 8765 1726776629.81581: done dumping result, returning 8765 1726776629.81588: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get current config [120fa90a-8a95-c4e4-06a7-000000000034] 8765 1726776629.81594: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000034 8765 1726776629.81621: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000034 8765 1726776629.81625: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "data": {} } 8283 1726776629.81776: no more pending results, returning what we have 8283 1726776629.81779: results queue empty 8283 1726776629.81779: checking for any_errors_fatal 8283 1726776629.81785: done checking for any_errors_fatal 8283 1726776629.81785: checking for max_fail_percentage 8283 1726776629.81786: done checking for max_fail_percentage 8283 1726776629.81787: checking to see if all hosts have failed and the running result is not ok 8283 1726776629.81787: done checking to see if all hosts have failed 8283 1726776629.81788: getting the remaining hosts for this loop 8283 1726776629.81789: done getting the remaining hosts for this loop 8283 1726776629.81792: getting the next task for host managed_node3 8283 1726776629.81797: done getting next task for host managed_node3 8283 1726776629.81801: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8283 1726776629.81804: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776629.81812: getting variables 8283 1726776629.81813: in VariableManager get_vars() 8283 1726776629.81844: Calling all_inventory to load vars for managed_node3 8283 1726776629.81847: Calling groups_inventory to load vars for managed_node3 8283 1726776629.81848: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776629.81856: Calling all_plugins_play to load vars for managed_node3 8283 1726776629.81857: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776629.81859: Calling groups_plugins_play to load vars for managed_node3 8283 1726776629.81895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776629.81924: done with get_vars() 8283 1726776629.81932: done getting variables 8283 1726776629.82010: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Apply kernel settings] ******* task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 Thursday 19 September 2024 16:10:29 -0400 (0:00:00.373) 0:00:13.525 **** 8283 1726776629.82035: entering _queue_task() for managed_node3/template 8283 1726776629.82036: Creating lock for template 8283 1726776629.82186: worker is 1 (out of 1 available) 8283 1726776629.82197: exiting _queue_task() for managed_node3/template 8283 1726776629.82206: done queuing things up, now waiting for results queue to drain 8283 1726776629.82207: waiting for pending results... 
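The Get current config task above returned "data": {} for /etc/tuned/kernel_settings/tuned.conf, i.e. the file is absent or empty on this node, and the role uses that when deciding what the upcoming template has to change. A minimal sketch of that kind of read, assuming a plain INI-style tuned.conf and an empty dict for a missing file; the collection's kernel_settings_get_config module may parse values differently, so this only illustrates the shape of the result:

    import configparser
    import os

    def get_config(path="/etc/tuned/kernel_settings/tuned.conf"):
        """Return the tuned.conf sections as a dict of dicts, or {} when absent."""
        if not os.path.isfile(path):
            return {}
        parser = configparser.ConfigParser()
        parser.read(path)
        return {section: dict(parser[section]) for section in parser.sections()}

    print(get_config())   # on this managed node the task result was {}
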
8789 1726776629.82369: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8789 1726776629.82469: in run() - task 120fa90a-8a95-c4e4-06a7-000000000035 8789 1726776629.82485: variable 'ansible_search_path' from source: unknown 8789 1726776629.82488: variable 'ansible_search_path' from source: unknown 8789 1726776629.82515: calling self._execute() 8789 1726776629.82562: variable 'ansible_host' from source: host vars for 'managed_node3' 8789 1726776629.82568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8789 1726776629.82573: variable 'omit' from source: magic vars 8789 1726776629.82643: variable 'omit' from source: magic vars 8789 1726776629.82672: variable 'omit' from source: magic vars 8789 1726776629.82899: variable '__kernel_settings_profile_src' from source: role '' all vars 8789 1726776629.82906: variable '__kernel_settings_tuned_profile' from source: role '' all vars 8789 1726776629.82960: variable '__kernel_settings_tuned_profile' from source: role '' all vars 8789 1726776629.82981: variable '__kernel_settings_profile_filename' from source: role '' all vars 8789 1726776629.83023: variable '__kernel_settings_profile_filename' from source: role '' all vars 8789 1726776629.83073: variable '__kernel_settings_profile_dir' from source: role '' all vars 8789 1726776629.83131: variable '__kernel_settings_profile_parent' from source: set_fact 8789 1726776629.83137: variable '__kernel_settings_tuned_profile' from source: role '' all vars 8789 1726776629.83160: variable 'omit' from source: magic vars 8789 1726776629.83193: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8789 1726776629.83217: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8789 1726776629.83261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8789 1726776629.83277: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8789 1726776629.83290: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8789 1726776629.83313: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8789 1726776629.83318: variable 'ansible_host' from source: host vars for 'managed_node3' 8789 1726776629.83323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8789 1726776629.83393: Set connection var ansible_module_compression to ZIP_DEFLATED 8789 1726776629.83400: Set connection var ansible_shell_type to sh 8789 1726776629.83405: Set connection var ansible_timeout to 10 8789 1726776629.83408: Set connection var ansible_connection to ssh 8789 1726776629.83412: Set connection var ansible_pipelining to False 8789 1726776629.83415: Set connection var ansible_shell_executable to /bin/sh 8789 1726776629.83431: variable 'ansible_shell_executable' from source: unknown 8789 1726776629.83435: variable 'ansible_connection' from source: unknown 8789 1726776629.83438: variable 'ansible_module_compression' from source: unknown 8789 1726776629.83441: variable 'ansible_shell_type' from source: unknown 8789 1726776629.83444: variable 'ansible_shell_executable' from source: unknown 8789 1726776629.83447: variable 'ansible_host' from source: host vars for 'managed_node3' 8789 1726776629.83451: variable 'ansible_pipelining' from 
source: unknown 8789 1726776629.83454: variable 'ansible_timeout' from source: unknown 8789 1726776629.83458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8789 1726776629.83549: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8789 1726776629.83557: variable 'omit' from source: magic vars 8789 1726776629.83562: starting attempt loop 8789 1726776629.83564: running the handler 8789 1726776629.83572: _low_level_execute_command(): starting 8789 1726776629.83578: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8789 1726776629.85881: stdout chunk (state=2): >>>/root <<< 8789 1726776629.86000: stderr chunk (state=3): >>><<< 8789 1726776629.86008: stdout chunk (state=3): >>><<< 8789 1726776629.86027: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8789 1726776629.86045: _low_level_execute_command(): starting 8789 1726776629.86052: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782 `" && echo ansible-tmp-1726776629.8603873-8789-91149281609782="` echo /root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782 `" ) && sleep 0' 8789 1726776629.88506: stdout chunk (state=2): >>>ansible-tmp-1726776629.8603873-8789-91149281609782=/root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782 <<< 8789 1726776629.88640: stderr chunk (state=3): >>><<< 8789 1726776629.88648: stdout chunk (state=3): >>><<< 8789 1726776629.88663: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776629.8603873-8789-91149281609782=/root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782 , stderr= 8789 1726776629.88679: evaluation_path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks 8789 1726776629.88698: search_path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2 8789 1726776629.88720: variable 'ansible_search_path' from source: unknown 8789 1726776629.89439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8789 1726776629.91192: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8789 1726776629.91245: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8789 1726776629.91282: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
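Before rendering, the template action logs the candidate locations it will try for kernel_settings.j2 (the search_path dump above: the role's templates/ directory, the role directory itself, tasks/templates/, tasks/, and the same pair under tests/kernel_settings). Resolution is effectively "use the first candidate that exists"; the template is rendered on the controller with Jinja2 and the result is then copied to the node like any other file. A bare-bones sketch of that resolve-and-render step using standalone jinja2, without Ansible's extra filters, lookups, or ansible_managed handling:

    import os
    from jinja2 import Template

    # First two candidates from the search_path dump above; the remaining
    # tasks/ and tests/kernel_settings locations would follow in the same order.
    CANDIDATES = [
        "/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2",
        "/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2",
    ]

    def resolve_template(candidates):
        """Return the first existing candidate path."""
        for path in candidates:
            if os.path.isfile(path):
                return path
        raise FileNotFoundError("kernel_settings.j2 not found in any search location")

    def render(path, variables):
        """Render the template text with plain Jinja2."""
        with open(path) as f:
            return Template(f.read()).render(**variables)

    path = resolve_template(CANDIDATES)
    # Rendering the real kernel_settings.j2 needs the full variable set the log
    # enumerates below (kernel_settings_sysctl, __sections, ansible_managed, ...):
    # print(render(path, role_variables))
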
8789 1726776629.91309: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8789 1726776629.91333: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8789 1726776629.91514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8789 1726776629.91539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8789 1726776629.91561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8789 1726776629.91589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8789 1726776629.91600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8789 1726776629.91823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8789 1726776629.91844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8789 1726776629.91864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8789 1726776629.91891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8789 1726776629.91903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8789 1726776629.92159: variable 'ansible_managed' from source: unknown 8789 1726776629.92166: variable '__sections' from source: task vars 8789 1726776629.92257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8789 1726776629.92277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8789 1726776629.92297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8789 1726776629.92322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8789 
1726776629.92335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8789 1726776629.92407: variable 'kernel_settings_sysctl' from source: role '' defaults 8789 1726776629.92413: variable '__kernel_settings_state_empty' from source: role '' all vars 8789 1726776629.92420: variable '__kernel_settings_previous_replaced' from source: role '' all vars 8789 1726776629.92450: variable '__sysctl_old' from source: task vars 8789 1726776629.92498: variable '__sysctl_old' from source: task vars 8789 1726776629.92642: variable 'kernel_settings_purge' from source: role '' defaults 8789 1726776629.92649: variable 'kernel_settings_sysctl' from source: role '' defaults 8789 1726776629.92654: variable '__kernel_settings_state_empty' from source: role '' all vars 8789 1726776629.92660: variable '__kernel_settings_previous_replaced' from source: role '' all vars 8789 1726776629.92664: variable '__kernel_settings_profile_contents' from source: set_fact 8789 1726776629.92796: variable 'kernel_settings_sysfs' from source: role '' defaults 8789 1726776629.92802: variable '__kernel_settings_state_empty' from source: role '' all vars 8789 1726776629.92808: variable '__kernel_settings_previous_replaced' from source: role '' all vars 8789 1726776629.92821: variable '__sysfs_old' from source: task vars 8789 1726776629.92863: variable '__sysfs_old' from source: task vars 8789 1726776629.93054: variable 'kernel_settings_purge' from source: role '' defaults 8789 1726776629.93060: variable 'kernel_settings_sysfs' from source: role '' defaults 8789 1726776629.93066: variable '__kernel_settings_state_empty' from source: role '' all vars 8789 1726776629.93071: variable '__kernel_settings_previous_replaced' from source: role '' all vars 8789 1726776629.93079: variable '__kernel_settings_profile_contents' from source: set_fact 8789 1726776629.93100: variable 'kernel_settings_systemd_cpu_affinity' from source: role '' defaults 8789 1726776629.93109: variable '__systemd_old' from source: task vars 8789 1726776629.93160: variable '__systemd_old' from source: task vars 8789 1726776629.93308: variable 'kernel_settings_purge' from source: role '' defaults 8789 1726776629.93315: variable 'kernel_settings_systemd_cpu_affinity' from source: role '' defaults 8789 1726776629.93319: variable '__kernel_settings_state_absent' from source: role '' all vars 8789 1726776629.93325: variable '__kernel_settings_profile_contents' from source: set_fact 8789 1726776629.93338: variable 'kernel_settings_transparent_hugepages' from source: role '' defaults 8789 1726776629.93343: variable 'kernel_settings_transparent_hugepages_defrag' from source: role '' defaults 8789 1726776629.93348: variable '__trans_huge_old' from source: task vars 8789 1726776629.93390: variable '__trans_huge_old' from source: task vars 8789 1726776629.93519: variable 'kernel_settings_purge' from source: role '' defaults 8789 1726776629.93526: variable 'kernel_settings_transparent_hugepages' from source: role '' defaults 8789 1726776629.93532: variable '__kernel_settings_state_absent' from source: role '' all vars 8789 1726776629.93538: variable '__kernel_settings_profile_contents' from source: set_fact 8789 1726776629.93547: variable '__trans_defrag_old' from source: task vars 8789 1726776629.93592: variable '__trans_defrag_old' from source: task vars 8789 1726776629.93767: variable 'kernel_settings_purge' from source: role '' 
defaults 8789 1726776629.93774: variable 'kernel_settings_transparent_hugepages_defrag' from source: role '' defaults 8789 1726776629.93781: variable '__kernel_settings_state_absent' from source: role '' all vars 8789 1726776629.93786: variable '__kernel_settings_profile_contents' from source: set_fact 8789 1726776629.93801: variable '__kernel_settings_state_absent' from source: role '' all vars 8789 1726776629.93813: variable '__kernel_settings_state_absent' from source: role '' all vars 8789 1726776629.93820: variable '__kernel_settings_state_absent' from source: role '' all vars 8789 1726776629.94421: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8789 1726776629.94471: variable 'ansible_module_compression' from source: unknown 8789 1726776629.94522: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8789 1726776629.94554: variable 'ansible_facts' from source: unknown 8789 1726776629.94643: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782/AnsiballZ_stat.py 8789 1726776629.94760: Sending initial data 8789 1726776629.94768: Sent initial data (150 bytes) 8789 1726776629.97596: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmp78vn34c5 /root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782/AnsiballZ_stat.py <<< 8789 1726776629.98348: stderr chunk (state=3): >>><<< 8789 1726776629.98357: stdout chunk (state=3): >>><<< 8789 1726776629.98375: done transferring module to remote 8789 1726776629.98388: _low_level_execute_command(): starting 8789 1726776629.98394: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782/ /root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782/AnsiballZ_stat.py && sleep 0' 8789 1726776630.00752: stderr chunk (state=2): >>><<< 8789 1726776630.00764: stdout chunk (state=2): >>><<< 8789 1726776630.00778: _low_level_execute_command() done: rc=0, stdout=, stderr= 8789 1726776630.00782: _low_level_execute_command(): starting 8789 1726776630.00785: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782/AnsiballZ_stat.py && sleep 0' 8789 1726776630.15854: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 8789 1726776630.16875: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 8789 1726776630.16922: stderr chunk (state=3): >>><<< 8789 1726776630.16931: stdout chunk (state=3): >>><<< 8789 1726776630.16947: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.186 closed. 
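Before transferring anything, the action plugin runs ansible.legacy.stat against the destination so it can decide whether a change is needed; here the file does not exist yet, so the copy that follows is guaranteed to report changed. The same pre-check can be written as an explicit task, sketched below purely for illustration (the real call is internal to the action plugin, and the registered variable name is made up):

    # Illustrative stand-alone equivalent of the internal stat pre-check.
    - name: Check whether the tuned profile already exists
      ansible.builtin.stat:
        path: /etc/tuned/kernel_settings/tuned.conf
        get_checksum: true
        checksum_algorithm: sha1
      register: __profile_stat   # hypothetical name

    - name: Show whether a change is expected
      ansible.builtin.debug:
        msg: "profile exists: {{ __profile_stat.stat.exists }}"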
8789 1726776630.16968: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8789 1726776630.17050: Sending initial data 8789 1726776630.17057: Sent initial data (158 bytes) 8789 1726776630.19576: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmptb2elc2s/kernel_settings.j2 /root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782/source <<< 8789 1726776630.19843: stderr chunk (state=3): >>><<< 8789 1726776630.19850: stdout chunk (state=3): >>><<< 8789 1726776630.19866: _low_level_execute_command(): starting 8789 1726776630.19872: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782/ /root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782/source && sleep 0' 8789 1726776630.22208: stderr chunk (state=2): >>><<< 8789 1726776630.22216: stdout chunk (state=2): >>><<< 8789 1726776630.22231: _low_level_execute_command() done: rc=0, stdout=, stderr= 8789 1726776630.22251: variable 'ansible_module_compression' from source: unknown 8789 1726776630.22289: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 8789 1726776630.22310: variable 'ansible_facts' from source: unknown 8789 1726776630.22364: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782/AnsiballZ_copy.py 8789 1726776630.22479: Sending initial data 8789 1726776630.22487: Sent initial data (150 bytes) 8789 1726776630.25435: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpj61xz9cn /root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782/AnsiballZ_copy.py <<< 8789 1726776630.26335: stderr chunk (state=3): >>><<< 8789 1726776630.26344: stdout chunk (state=3): >>><<< 8789 1726776630.26364: done transferring module to remote 8789 1726776630.26373: _low_level_execute_command(): starting 8789 1726776630.26381: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782/ /root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782/AnsiballZ_copy.py && sleep 0' 8789 1726776630.28924: stderr chunk (state=2): >>><<< 8789 1726776630.28934: stdout chunk (state=2): >>><<< 8789 1726776630.28948: _low_level_execute_command() done: rc=0, stdout=, stderr= 8789 1726776630.28953: _low_level_execute_command(): starting 8789 1726776630.28958: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782/AnsiballZ_copy.py && sleep 0' 8789 1726776630.45566: stdout chunk (state=2): >>> {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": 
"/root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782/source", "md5sum": "7d83891795eeb6debeff7e2812501630", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8789 1726776630.46841: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 8789 1726776630.46853: stdout chunk (state=3): >>><<< 8789 1726776630.46866: stderr chunk (state=3): >>><<< 8789 1726776630.46880: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782/source", "md5sum": "7d83891795eeb6debeff7e2812501630", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.8.186 closed. 
8789 1726776630.46919: done with _execute_module (ansible.legacy.copy, {'src': '/root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782/source', 'dest': '/etc/tuned/kernel_settings/tuned.conf', 'mode': '0644', 'follow': False, '_original_basename': 'kernel_settings.j2', 'checksum': 'e44ba7fc7046252a1b6772f7347d0e7b9b48a069', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8789 1726776630.46954: _low_level_execute_command(): starting 8789 1726776630.46962: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782/ > /dev/null 2>&1 && sleep 0' 8789 1726776630.49573: stderr chunk (state=2): >>><<< 8789 1726776630.49584: stdout chunk (state=2): >>><<< 8789 1726776630.49601: _low_level_execute_command() done: rc=0, stdout=, stderr= 8789 1726776630.49611: handler run complete 8789 1726776630.49642: attempt loop complete, returning result 8789 1726776630.49648: _execute() done 8789 1726776630.49650: dumping result to json 8789 1726776630.49656: done dumping result, returning 8789 1726776630.49663: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings [120fa90a-8a95-c4e4-06a7-000000000035] 8789 1726776630.49668: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000035 8789 1726776630.49726: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000035 8789 1726776630.49734: WORKER PROCESS EXITING changed: [managed_node3] => { "changed": true, "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "dest": "/etc/tuned/kernel_settings/tuned.conf", "gid": 0, "group": "root", "md5sum": "7d83891795eeb6debeff7e2812501630", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "src": "/root/.ansible/tmp/ansible-tmp-1726776629.8603873-8789-91149281609782/source", "state": "file", "uid": 0 } 8283 1726776630.50196: no more pending results, returning what we have 8283 1726776630.50199: results queue empty 8283 1726776630.50200: checking for any_errors_fatal 8283 1726776630.50205: done checking for any_errors_fatal 8283 1726776630.50205: checking for max_fail_percentage 8283 1726776630.50207: done checking for max_fail_percentage 8283 1726776630.50207: checking to see if all hosts have failed and the running result is not ok 8283 1726776630.50208: done checking to see if all hosts have failed 8283 1726776630.50208: getting the remaining hosts for this loop 8283 1726776630.50209: done getting the remaining hosts for this loop 8283 1726776630.50212: getting the next task for host managed_node3 8283 1726776630.50219: done getting next task for host managed_node3 8283 1726776630.50222: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8283 1726776630.50225: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776630.50235: getting variables 8283 1726776630.50236: in VariableManager get_vars() 8283 1726776630.50268: Calling all_inventory to load vars for managed_node3 8283 1726776630.50271: Calling groups_inventory to load vars for managed_node3 8283 1726776630.50273: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776630.50281: Calling all_plugins_play to load vars for managed_node3 8283 1726776630.50283: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776630.50286: Calling groups_plugins_play to load vars for managed_node3 8283 1726776630.50337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776630.50377: done with get_vars() 8283 1726776630.50385: done getting variables 8283 1726776630.50441: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149 Thursday 19 September 2024 16:10:30 -0400 (0:00:00.684) 0:00:14.209 **** 8283 1726776630.50470: entering _queue_task() for managed_node3/service 8283 1726776630.50663: worker is 1 (out of 1 available) 8283 1726776630.50675: exiting _queue_task() for managed_node3/service 8283 1726776630.50686: done queuing things up, now waiting for results queue to drain 8283 1726776630.50688: waiting for pending results... 
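The task queued next lives at roles/kernel_settings/tasks/main.yml:149 and uses the service action. Combining the loop item, the conditional, and the module_args that appear in the worker trace below, a plausible reconstruction looks like the following; treat it as a sketch, not the role's literal source.

    # Assumed reconstruction from the loop, conditional, and module_args visible in this trace.
    - name: Restart tuned to apply active profile, mode changes
      ansible.builtin.service:
        name: "{{ item }}"
        state: restarted
        enabled: true
      loop: "{{ __kernel_settings_services }}"   # resolves to the single item 'tuned' in this run
      when: __kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed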
8838 1726776630.50891: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8838 1726776630.51013: in run() - task 120fa90a-8a95-c4e4-06a7-000000000036 8838 1726776630.51031: variable 'ansible_search_path' from source: unknown 8838 1726776630.51035: variable 'ansible_search_path' from source: unknown 8838 1726776630.51073: variable '__kernel_settings_services' from source: include_vars 8838 1726776630.51352: variable '__kernel_settings_services' from source: include_vars 8838 1726776630.51416: variable 'omit' from source: magic vars 8838 1726776630.51508: variable 'ansible_host' from source: host vars for 'managed_node3' 8838 1726776630.51520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8838 1726776630.51532: variable 'omit' from source: magic vars 8838 1726776630.52000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8838 1726776630.52227: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8838 1726776630.52272: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8838 1726776630.52303: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8838 1726776630.52336: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8838 1726776630.52428: variable '__kernel_settings_register_profile' from source: set_fact 8838 1726776630.52445: variable '__kernel_settings_register_mode' from source: set_fact 8838 1726776630.52463: Evaluated conditional (__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed): True 8838 1726776630.52470: variable 'omit' from source: magic vars 8838 1726776630.52509: variable 'omit' from source: magic vars 8838 1726776630.52555: variable 'item' from source: unknown 8838 1726776630.52616: variable 'item' from source: unknown 8838 1726776630.52638: variable 'omit' from source: magic vars 8838 1726776630.52668: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8838 1726776630.52695: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8838 1726776630.52713: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8838 1726776630.52732: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8838 1726776630.52743: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8838 1726776630.52770: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8838 1726776630.52775: variable 'ansible_host' from source: host vars for 'managed_node3' 8838 1726776630.52779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8838 1726776630.52919: Set connection var ansible_module_compression to ZIP_DEFLATED 8838 1726776630.52927: Set connection var ansible_shell_type to sh 8838 1726776630.52936: Set connection var ansible_timeout to 10 8838 1726776630.52943: Set connection var ansible_connection to ssh 8838 1726776630.52950: Set connection var ansible_pipelining to False 8838 1726776630.52955: Set connection var ansible_shell_executable to /bin/sh 8838 
1726776630.52972: variable 'ansible_shell_executable' from source: unknown 8838 1726776630.52975: variable 'ansible_connection' from source: unknown 8838 1726776630.52978: variable 'ansible_module_compression' from source: unknown 8838 1726776630.52981: variable 'ansible_shell_type' from source: unknown 8838 1726776630.52983: variable 'ansible_shell_executable' from source: unknown 8838 1726776630.52986: variable 'ansible_host' from source: host vars for 'managed_node3' 8838 1726776630.52989: variable 'ansible_pipelining' from source: unknown 8838 1726776630.52991: variable 'ansible_timeout' from source: unknown 8838 1726776630.52995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8838 1726776630.53079: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8838 1726776630.53089: variable 'omit' from source: magic vars 8838 1726776630.53096: starting attempt loop 8838 1726776630.53099: running the handler 8838 1726776630.53174: variable 'ansible_facts' from source: unknown 8838 1726776630.53206: _low_level_execute_command(): starting 8838 1726776630.53215: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8838 1726776630.56330: stdout chunk (state=2): >>>/root <<< 8838 1726776630.56343: stderr chunk (state=2): >>><<< 8838 1726776630.56355: stdout chunk (state=3): >>><<< 8838 1726776630.56372: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8838 1726776630.56387: _low_level_execute_command(): starting 8838 1726776630.56394: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776630.5638077-8838-155387709648883 `" && echo ansible-tmp-1726776630.5638077-8838-155387709648883="` echo /root/.ansible/tmp/ansible-tmp-1726776630.5638077-8838-155387709648883 `" ) && sleep 0' 8838 1726776630.59069: stdout chunk (state=2): >>>ansible-tmp-1726776630.5638077-8838-155387709648883=/root/.ansible/tmp/ansible-tmp-1726776630.5638077-8838-155387709648883 <<< 8838 1726776630.59199: stderr chunk (state=3): >>><<< 8838 1726776630.59206: stdout chunk (state=3): >>><<< 8838 1726776630.59220: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776630.5638077-8838-155387709648883=/root/.ansible/tmp/ansible-tmp-1726776630.5638077-8838-155387709648883 , stderr= 8838 1726776630.59250: variable 'ansible_module_compression' from source: unknown 8838 1726776630.59293: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 8838 1726776630.59353: variable 'ansible_facts' from source: unknown 8838 1726776630.59551: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776630.5638077-8838-155387709648883/AnsiballZ_setup.py 8838 1726776630.59990: Sending initial data 8838 1726776630.59997: Sent initial data (152 bytes) 8838 1726776630.62487: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpwr3g7opm /root/.ansible/tmp/ansible-tmp-1726776630.5638077-8838-155387709648883/AnsiballZ_setup.py <<< 8838 1726776630.64744: stderr chunk (state=3): >>><<< 8838 1726776630.64755: stdout chunk (state=3): >>><<< 8838 1726776630.64774: done transferring module to remote 8838 1726776630.64783: 
_low_level_execute_command(): starting 8838 1726776630.64787: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776630.5638077-8838-155387709648883/ /root/.ansible/tmp/ansible-tmp-1726776630.5638077-8838-155387709648883/AnsiballZ_setup.py && sleep 0' 8838 1726776630.67150: stderr chunk (state=2): >>><<< 8838 1726776630.67158: stdout chunk (state=2): >>><<< 8838 1726776630.67169: _low_level_execute_command() done: rc=0, stdout=, stderr= 8838 1726776630.67172: _low_level_execute_command(): starting 8838 1726776630.67176: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776630.5638077-8838-155387709648883/AnsiballZ_setup.py && sleep 0' 8838 1726776630.94757: stdout chunk (state=2): >>> {"ansible_facts": {"ansible_service_mgr": "systemd"}, "invocation": {"module_args": {"gather_subset": ["!all"], "filter": ["ansible_service_mgr"], "gather_timeout": 10, "fact_path": "/etc/ansible/facts.d"}}} <<< 8838 1726776630.96408: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 8838 1726776630.96419: stdout chunk (state=3): >>><<< 8838 1726776630.96430: stderr chunk (state=3): >>><<< 8838 1726776630.96442: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_service_mgr": "systemd"}, "invocation": {"module_args": {"gather_subset": ["!all"], "filter": ["ansible_service_mgr"], "gather_timeout": 10, "fact_path": "/etc/ansible/facts.d"}}} , stderr=Shared connection to 10.31.8.186 closed. 8838 1726776630.96466: done with _execute_module (ansible.legacy.setup, {'gather_subset': '!all', 'filter': 'ansible_service_mgr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776630.5638077-8838-155387709648883/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8838 1726776630.96485: Facts {'ansible_facts': {'ansible_service_mgr': 'systemd'}, 'invocation': {'module_args': {'gather_subset': ['!all'], 'filter': ['ansible_service_mgr'], 'gather_timeout': 10, 'fact_path': '/etc/ansible/facts.d'}}, '_ansible_parsed': True} 8838 1726776630.96538: variable 'ansible_module_compression' from source: unknown 8838 1726776630.96570: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 8838 1726776630.96617: variable 'ansible_facts' from source: unknown 8838 1726776630.96770: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776630.5638077-8838-155387709648883/AnsiballZ_systemd.py 8838 1726776630.96862: Sending initial data 8838 1726776630.96870: Sent initial data (154 bytes) 8838 1726776630.99436: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmp7bx69abj /root/.ansible/tmp/ansible-tmp-1726776630.5638077-8838-155387709648883/AnsiballZ_systemd.py <<< 8838 1726776631.02147: stderr chunk (state=3): >>><<< 8838 1726776631.02157: stdout chunk (state=3): >>><<< 8838 1726776631.02189: done transferring module to remote 8838 1726776631.02204: _low_level_execute_command(): starting 8838 
1726776631.02210: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776630.5638077-8838-155387709648883/ /root/.ansible/tmp/ansible-tmp-1726776630.5638077-8838-155387709648883/AnsiballZ_systemd.py && sleep 0' 8838 1726776631.04954: stderr chunk (state=2): >>><<< 8838 1726776631.04962: stdout chunk (state=2): >>><<< 8838 1726776631.04976: _low_level_execute_command() done: rc=0, stdout=, stderr= 8838 1726776631.04981: _low_level_execute_command(): starting 8838 1726776631.04988: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776630.5638077-8838-155387709648883/AnsiballZ_systemd.py && sleep 0' 8838 1726776631.58337: stdout chunk (state=2): >>> {"name": "tuned", "changed": true, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:06:54 EDT", "WatchdogTimestampMonotonic": "23956134", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "677", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:06:53 EDT", "ExecMainStartTimestampMonotonic": "22773431", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "677", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:06:53 EDT] ; stop_time=[n/a] ; pid=677 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18624512", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": <<< 8838 1726776631.58357: stdout chunk (state=3): >>>"infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", 
"LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "sysinit.target system.slice dbus.socket dbus.service", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target cpupower.service", "Before": "multi-user.target shutdown.target", "After": "system.slice systemd-journald.socket dbus.service network.target sysinit.target basic.target dbus.socket systemd-sysctl.service polkit.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:06:54 EDT", "StateChangeTimestampMonotonic": "23956138", "InactiveExitTimestamp": "Thu 2024-09-19 16:06:53 EDT", "InactiveExitTimestampMonotonic": 
"22773473", "ActiveEnterTimestamp": "Thu 2024-09-19 16:06:54 EDT", "ActiveEnterTimestampMonotonic": "23956138", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:06:53 EDT", "ConditionTimestampMonotonic": "22772029", "AssertTimestamp": "Thu 2024-09-19 16:06:53 EDT", "AssertTimestampMonotonic": "22772031", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "91929146ef1a4ea9a56f8b38e1888644", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "restarted", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 8838 1726776631.60147: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 8838 1726776631.60157: stdout chunk (state=3): >>><<< 8838 1726776631.60167: stderr chunk (state=3): >>><<< 8838 1726776631.60189: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": true, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:06:54 EDT", "WatchdogTimestampMonotonic": "23956134", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "677", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:06:53 EDT", "ExecMainStartTimestampMonotonic": "22773431", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "677", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:06:53 EDT] ; stop_time=[n/a] ; pid=677 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18624512", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", 
"DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "sysinit.target system.slice dbus.socket dbus.service", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service tlp.service 
power-profiles-daemon.service shutdown.target cpupower.service", "Before": "multi-user.target shutdown.target", "After": "system.slice systemd-journald.socket dbus.service network.target sysinit.target basic.target dbus.socket systemd-sysctl.service polkit.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:06:54 EDT", "StateChangeTimestampMonotonic": "23956138", "InactiveExitTimestamp": "Thu 2024-09-19 16:06:53 EDT", "InactiveExitTimestampMonotonic": "22773473", "ActiveEnterTimestamp": "Thu 2024-09-19 16:06:54 EDT", "ActiveEnterTimestampMonotonic": "23956138", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:06:53 EDT", "ConditionTimestampMonotonic": "22772029", "AssertTimestamp": "Thu 2024-09-19 16:06:53 EDT", "AssertTimestampMonotonic": "22772031", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "91929146ef1a4ea9a56f8b38e1888644", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "restarted", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.8.186 closed. 
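Earlier in this worker's trace, the service action ran a minimal fact gather (ansible.legacy.setup with gather_subset '!all' and a filter on ansible_service_mgr) to learn which service manager the target uses, and only then dispatched the systemd module whose output ends above. The same narrow gather can be requested explicitly, as in this illustrative sketch:

    # Explicit equivalent of the internal service-manager detection (illustrative only).
    - name: Detect the service manager
      ansible.builtin.setup:
        gather_subset:
          - "!all"
        filter:
          - ansible_service_mgr

    - name: Show the detected service manager
      ansible.builtin.debug:
        var: ansible_service_mgr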
8838 1726776631.60369: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'restarted', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776630.5638077-8838-155387709648883/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8838 1726776631.60396: _low_level_execute_command(): starting 8838 1726776631.60405: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776630.5638077-8838-155387709648883/ > /dev/null 2>&1 && sleep 0' 8838 1726776631.63191: stderr chunk (state=2): >>><<< 8838 1726776631.63201: stdout chunk (state=2): >>><<< 8838 1726776631.63220: _low_level_execute_command() done: rc=0, stdout=, stderr= 8838 1726776631.63230: handler run complete 8838 1726776631.63266: attempt loop complete, returning result 8838 1726776631.63287: variable 'item' from source: unknown 8838 1726776631.63358: variable 'item' from source: unknown changed: [managed_node3] => (item=tuned) => { "ansible_loop_var": "item", "changed": true, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 16:06:54 EDT", "ActiveEnterTimestampMonotonic": "23956138", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "system.slice systemd-journald.socket dbus.service network.target sysinit.target basic.target dbus.socket systemd-sysctl.service polkit.service", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 16:06:53 EDT", "AssertTimestampMonotonic": "22772031", "Before": "multi-user.target shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:06:53 EDT", "ConditionTimestampMonotonic": "22772029", "ConfigurationDirectoryMode": "0755", "Conflicts": "auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target 
cpupower.service", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "677", "ExecMainStartTimestamp": "Thu 2024-09-19 16:06:53 EDT", "ExecMainStartTimestampMonotonic": "22773431", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:06:53 EDT] ; stop_time=[n/a] ; pid=677 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Thu 2024-09-19 16:06:53 EDT", "InactiveExitTimestampMonotonic": "22773473", "InvocationID": "91929146ef1a4ea9a56f8b38e1888644", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "677", "MemoryAccounting": "yes", "MemoryCurrent": "18624512", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", 
"ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "sysinit.target system.slice dbus.socket dbus.service", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 16:06:54 EDT", "StateChangeTimestampMonotonic": "23956138", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 16:06:54 EDT", "WatchdogTimestampMonotonic": "23956134", "WatchdogUSec": "0" } } 8838 1726776631.63485: dumping result to json 8838 1726776631.63502: done dumping result, returning 8838 1726776631.63511: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes [120fa90a-8a95-c4e4-06a7-000000000036] 8838 1726776631.63517: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000036 8838 1726776631.63623: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000036 8838 1726776631.63627: WORKER PROCESS EXITING 8283 1726776631.64166: no more pending results, returning what we have 8283 1726776631.64169: results queue empty 8283 1726776631.64169: checking for any_errors_fatal 8283 1726776631.64175: done checking for any_errors_fatal 8283 1726776631.64176: checking for max_fail_percentage 8283 1726776631.64177: done checking for max_fail_percentage 8283 1726776631.64177: checking to see if all hosts have failed and the running result is not ok 8283 1726776631.64180: done checking to see if all hosts have failed 8283 1726776631.64181: getting the remaining hosts for this loop 8283 1726776631.64182: done getting the remaining hosts for this loop 8283 1726776631.64185: getting the next task for host managed_node3 8283 1726776631.64190: done getting next task for host managed_node3 8283 1726776631.64192: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8283 1726776631.64195: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776631.64204: getting variables 8283 1726776631.64205: in VariableManager get_vars() 8283 1726776631.64230: Calling all_inventory to load vars for managed_node3 8283 1726776631.64233: Calling groups_inventory to load vars for managed_node3 8283 1726776631.64234: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776631.64243: Calling all_plugins_play to load vars for managed_node3 8283 1726776631.64245: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776631.64248: Calling groups_plugins_play to load vars for managed_node3 8283 1726776631.64297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776631.64335: done with get_vars() 8283 1726776631.64343: done getting variables 8283 1726776631.64426: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Tuned apply settings] ******** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 Thursday 19 September 2024 16:10:31 -0400 (0:00:01.139) 0:00:15.349 **** 8283 1726776631.64458: entering _queue_task() for managed_node3/command 8283 1726776631.64460: Creating lock for command 8283 1726776631.64651: worker is 1 (out of 1 available) 8283 1726776631.64666: exiting _queue_task() for managed_node3/command 8283 1726776631.64677: done queuing things up, now waiting for results queue to drain 8283 1726776631.64681: waiting for pending results... 
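The systemd property dump in the preceding result is what the restart task returned for tuned.service (Id, MainPID, SubState=running, and so on). The role's own YAML for that step is not reproduced in this log; a minimal sketch of an equivalent restart, assuming the ansible.builtin.systemd module and the unit name taken from the "Id" field above, would be:

  - name: Restart tuned to apply active profile, mode changes
    ansible.builtin.systemd:
      name: tuned.service      # unit name from the "Id" field in the result above
      state: restarted
    # any conditionals or handler wiring used by the real role are not visible in this log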
8922 1726776631.64798: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8922 1726776631.64898: in run() - task 120fa90a-8a95-c4e4-06a7-000000000037 8922 1726776631.64914: variable 'ansible_search_path' from source: unknown 8922 1726776631.64918: variable 'ansible_search_path' from source: unknown 8922 1726776631.64948: calling self._execute() 8922 1726776631.64995: variable 'ansible_host' from source: host vars for 'managed_node3' 8922 1726776631.65004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8922 1726776631.65013: variable 'omit' from source: magic vars 8922 1726776631.65360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8922 1726776631.65562: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8922 1726776631.65596: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8922 1726776631.65621: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8922 1726776631.65649: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8922 1726776631.65735: variable '__kernel_settings_register_profile' from source: set_fact 8922 1726776631.65755: Evaluated conditional (not __kernel_settings_register_profile is changed): False 8922 1726776631.65760: when evaluation is False, skipping this task 8922 1726776631.65763: _execute() done 8922 1726776631.65766: dumping result to json 8922 1726776631.65770: done dumping result, returning 8922 1726776631.65776: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings [120fa90a-8a95-c4e4-06a7-000000000037] 8922 1726776631.65784: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000037 8922 1726776631.65805: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000037 8922 1726776631.65808: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __kernel_settings_register_profile is changed", "skip_reason": "Conditional result was False" } 8283 1726776631.65916: no more pending results, returning what we have 8283 1726776631.65918: results queue empty 8283 1726776631.65919: checking for any_errors_fatal 8283 1726776631.65935: done checking for any_errors_fatal 8283 1726776631.65936: checking for max_fail_percentage 8283 1726776631.65937: done checking for max_fail_percentage 8283 1726776631.65937: checking to see if all hosts have failed and the running result is not ok 8283 1726776631.65938: done checking to see if all hosts have failed 8283 1726776631.65938: getting the remaining hosts for this loop 8283 1726776631.65939: done getting the remaining hosts for this loop 8283 1726776631.65942: getting the next task for host managed_node3 8283 1726776631.65947: done getting next task for host managed_node3 8283 1726776631.65950: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8283 1726776631.65953: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776631.65964: getting variables 8283 1726776631.65965: in VariableManager get_vars() 8283 1726776631.65995: Calling all_inventory to load vars for managed_node3 8283 1726776631.65998: Calling groups_inventory to load vars for managed_node3 8283 1726776631.65999: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776631.66005: Calling all_plugins_play to load vars for managed_node3 8283 1726776631.66007: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776631.66009: Calling groups_plugins_play to load vars for managed_node3 8283 1726776631.66046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776631.66073: done with get_vars() 8283 1726776631.66078: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Verify settings] ************* task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:166 Thursday 19 September 2024 16:10:31 -0400 (0:00:00.016) 0:00:15.366 **** 8283 1726776631.66144: entering _queue_task() for managed_node3/include_tasks 8283 1726776631.66285: worker is 1 (out of 1 available) 8283 1726776631.66296: exiting _queue_task() for managed_node3/include_tasks 8283 1726776631.66305: done queuing things up, now waiting for results queue to drain 8283 1726776631.66306: waiting for pending results... 8923 1726776631.66468: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8923 1726776631.66564: in run() - task 120fa90a-8a95-c4e4-06a7-000000000038 8923 1726776631.66582: variable 'ansible_search_path' from source: unknown 8923 1726776631.66586: variable 'ansible_search_path' from source: unknown 8923 1726776631.66613: calling self._execute() 8923 1726776631.66660: variable 'ansible_host' from source: host vars for 'managed_node3' 8923 1726776631.66669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8923 1726776631.66677: variable 'omit' from source: magic vars 8923 1726776631.66987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8923 1726776631.67163: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8923 1726776631.67198: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8923 1726776631.67223: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8923 1726776631.67249: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8923 1726776631.67323: variable '__kernel_settings_register_apply' from source: set_fact 8923 1726776631.67396: Evaluated conditional (__kernel_settings_register_apply is changed): True 8923 1726776631.67403: _execute() done 8923 1726776631.67407: dumping result to json 8923 1726776631.67411: done dumping result, returning 8923 1726776631.67416: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Verify settings [120fa90a-8a95-c4e4-06a7-000000000038] 8923 1726776631.67423: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000038 8923 1726776631.67448: done sending task result 
for task 120fa90a-8a95-c4e4-06a7-000000000038 8923 1726776631.67452: WORKER PROCESS EXITING 8283 1726776631.67646: no more pending results, returning what we have 8283 1726776631.67650: in VariableManager get_vars() 8283 1726776631.67685: Calling all_inventory to load vars for managed_node3 8283 1726776631.67688: Calling groups_inventory to load vars for managed_node3 8283 1726776631.67690: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776631.67698: Calling all_plugins_play to load vars for managed_node3 8283 1726776631.67700: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776631.67703: Calling groups_plugins_play to load vars for managed_node3 8283 1726776631.67750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776631.67786: done with get_vars() 8283 1726776631.67791: variable 'ansible_search_path' from source: unknown 8283 1726776631.67792: variable 'ansible_search_path' from source: unknown 8283 1726776631.67824: we have included files to process 8283 1726776631.67825: generating all_blocks data 8283 1726776631.67830: done generating all_blocks data 8283 1726776631.67834: processing included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8283 1726776631.67835: loading included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8283 1726776631.67837: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml included: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml for managed_node3 8283 1726776631.68234: done processing included file 8283 1726776631.68235: iterating over new_blocks loaded from include file 8283 1726776631.68236: in VariableManager get_vars() 8283 1726776631.68251: done with get_vars() 8283 1726776631.68252: filtering new block on tags 8283 1726776631.68267: done filtering new block on tags 8283 1726776631.68268: done iterating over new_blocks loaded from include file 8283 1726776631.68269: extending task lists for all hosts with included blocks 8283 1726776631.68676: done extending task lists 8283 1726776631.68677: done processing included files 8283 1726776631.68677: results queue empty 8283 1726776631.68677: checking for any_errors_fatal 8283 1726776631.68680: done checking for any_errors_fatal 8283 1726776631.68681: checking for max_fail_percentage 8283 1726776631.68681: done checking for max_fail_percentage 8283 1726776631.68682: checking to see if all hosts have failed and the running result is not ok 8283 1726776631.68682: done checking to see if all hosts have failed 8283 1726776631.68682: getting the remaining hosts for this loop 8283 1726776631.68683: done getting the remaining hosts for this loop 8283 1726776631.68684: getting the next task for host managed_node3 8283 1726776631.68687: done getting next task for host managed_node3 8283 1726776631.68689: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 8283 1726776631.68691: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776631.68696: getting variables 8283 1726776631.68697: in VariableManager get_vars() 8283 1726776631.68705: Calling all_inventory to load vars for managed_node3 8283 1726776631.68706: Calling groups_inventory to load vars for managed_node3 8283 1726776631.68707: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776631.68710: Calling all_plugins_play to load vars for managed_node3 8283 1726776631.68712: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776631.68713: Calling groups_plugins_play to load vars for managed_node3 8283 1726776631.68735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776631.68757: done with get_vars() 8283 1726776631.68761: done getting variables 8283 1726776631.68784: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2 Thursday 19 September 2024 16:10:31 -0400 (0:00:00.026) 0:00:15.392 **** 8283 1726776631.68809: entering _queue_task() for managed_node3/command 8283 1726776631.68954: worker is 1 (out of 1 available) 8283 1726776631.68967: exiting _queue_task() for managed_node3/command 8283 1726776631.68978: done queuing things up, now waiting for results queue to drain 8283 1726776631.68980: waiting for pending results... 
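The "Verify settings" step above is queued as include_tasks, and the log shows the conditional __kernel_settings_register_apply is changed evaluating to True before verify_settings.yml is loaded for managed_node3. Reconstructed from those details, the task at main.yml:166 is roughly the following sketch (the exact layout in the role may differ):

  - name: Verify settings
    include_tasks: verify_settings.yml
    when: __kernel_settings_register_apply is changed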
8925 1726776631.69086: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 8925 1726776631.69189: in run() - task 120fa90a-8a95-c4e4-06a7-0000000000f5 8925 1726776631.69203: variable 'ansible_search_path' from source: unknown 8925 1726776631.69207: variable 'ansible_search_path' from source: unknown 8925 1726776631.69235: calling self._execute() 8925 1726776631.69350: variable 'ansible_host' from source: host vars for 'managed_node3' 8925 1726776631.69359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8925 1726776631.69367: variable 'omit' from source: magic vars 8925 1726776631.69434: variable 'omit' from source: magic vars 8925 1726776631.69473: variable 'omit' from source: magic vars 8925 1726776631.69496: variable 'omit' from source: magic vars 8925 1726776631.69528: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8925 1726776631.69556: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8925 1726776631.69574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8925 1726776631.69590: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8925 1726776631.69601: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8925 1726776631.69623: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8925 1726776631.69628: variable 'ansible_host' from source: host vars for 'managed_node3' 8925 1726776631.69634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8925 1726776631.69702: Set connection var ansible_module_compression to ZIP_DEFLATED 8925 1726776631.69709: Set connection var ansible_shell_type to sh 8925 1726776631.69716: Set connection var ansible_timeout to 10 8925 1726776631.69721: Set connection var ansible_connection to ssh 8925 1726776631.69728: Set connection var ansible_pipelining to False 8925 1726776631.69735: Set connection var ansible_shell_executable to /bin/sh 8925 1726776631.69751: variable 'ansible_shell_executable' from source: unknown 8925 1726776631.69755: variable 'ansible_connection' from source: unknown 8925 1726776631.69759: variable 'ansible_module_compression' from source: unknown 8925 1726776631.69762: variable 'ansible_shell_type' from source: unknown 8925 1726776631.69765: variable 'ansible_shell_executable' from source: unknown 8925 1726776631.69768: variable 'ansible_host' from source: host vars for 'managed_node3' 8925 1726776631.69772: variable 'ansible_pipelining' from source: unknown 8925 1726776631.69776: variable 'ansible_timeout' from source: unknown 8925 1726776631.69782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8925 1726776631.69866: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8925 1726776631.69878: variable 'omit' from source: magic vars 8925 1726776631.69885: starting attempt loop 8925 1726776631.69889: running the handler 8925 1726776631.69899: _low_level_execute_command(): starting 8925 
1726776631.69906: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8925 1726776631.72179: stdout chunk (state=2): >>>/root <<< 8925 1726776631.72343: stderr chunk (state=3): >>><<< 8925 1726776631.72352: stdout chunk (state=3): >>><<< 8925 1726776631.72374: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8925 1726776631.72389: _low_level_execute_command(): starting 8925 1726776631.72395: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776631.7238293-8925-121506398519829 `" && echo ansible-tmp-1726776631.7238293-8925-121506398519829="` echo /root/.ansible/tmp/ansible-tmp-1726776631.7238293-8925-121506398519829 `" ) && sleep 0' 8925 1726776631.74993: stdout chunk (state=2): >>>ansible-tmp-1726776631.7238293-8925-121506398519829=/root/.ansible/tmp/ansible-tmp-1726776631.7238293-8925-121506398519829 <<< 8925 1726776631.75121: stderr chunk (state=3): >>><<< 8925 1726776631.75130: stdout chunk (state=3): >>><<< 8925 1726776631.75145: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776631.7238293-8925-121506398519829=/root/.ansible/tmp/ansible-tmp-1726776631.7238293-8925-121506398519829 , stderr= 8925 1726776631.75171: variable 'ansible_module_compression' from source: unknown 8925 1726776631.75214: ANSIBALLZ: Using generic lock for ansible.legacy.command 8925 1726776631.75219: ANSIBALLZ: Acquiring lock 8925 1726776631.75222: ANSIBALLZ: Lock acquired: 140690877500448 8925 1726776631.75226: ANSIBALLZ: Creating module 8925 1726776631.84635: ANSIBALLZ: Writing module into payload 8925 1726776631.84731: ANSIBALLZ: Writing module 8925 1726776631.84751: ANSIBALLZ: Renaming module 8925 1726776631.84758: ANSIBALLZ: Done creating module 8925 1726776631.84773: variable 'ansible_facts' from source: unknown 8925 1726776631.84831: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776631.7238293-8925-121506398519829/AnsiballZ_command.py 8925 1726776631.84933: Sending initial data 8925 1726776631.84941: Sent initial data (154 bytes) 8925 1726776631.87467: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpgq3giunx /root/.ansible/tmp/ansible-tmp-1726776631.7238293-8925-121506398519829/AnsiballZ_command.py <<< 8925 1726776631.88417: stderr chunk (state=3): >>><<< 8925 1726776631.88423: stdout chunk (state=3): >>><<< 8925 1726776631.88441: done transferring module to remote 8925 1726776631.88451: _low_level_execute_command(): starting 8925 1726776631.88457: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776631.7238293-8925-121506398519829/ /root/.ansible/tmp/ansible-tmp-1726776631.7238293-8925-121506398519829/AnsiballZ_command.py && sleep 0' 8925 1726776631.90770: stderr chunk (state=2): >>><<< 8925 1726776631.90779: stdout chunk (state=2): >>><<< 8925 1726776631.90793: _low_level_execute_command() done: rc=0, stdout=, stderr= 8925 1726776631.90797: _low_level_execute_command(): starting 8925 1726776631.90803: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776631.7238293-8925-121506398519829/AnsiballZ_command.py && sleep 0' 8925 1726776632.18015: stdout chunk (state=2): >>> {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", 
"verify", "-i"], "start": "2024-09-19 16:10:32.057067", "end": "2024-09-19 16:10:32.176213", "delta": "0:00:00.119146", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8925 1726776632.20317: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 8925 1726776632.20328: stdout chunk (state=3): >>><<< 8925 1726776632.20340: stderr chunk (state=3): >>><<< 8925 1726776632.20354: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 16:10:32.057067", "end": "2024-09-19 16:10:32.176213", "delta": "0:00:00.119146", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.186 closed. 8925 1726776632.20401: done with _execute_module (ansible.legacy.command, {'_raw_params': 'tuned-adm verify -i', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776631.7238293-8925-121506398519829/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8925 1726776632.20410: _low_level_execute_command(): starting 8925 1726776632.20418: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776631.7238293-8925-121506398519829/ > /dev/null 2>&1 && sleep 0' 8925 1726776632.25693: stderr chunk (state=2): >>><<< 8925 1726776632.25704: stdout chunk (state=2): >>><<< 8925 1726776632.25721: _low_level_execute_command() done: rc=0, stdout=, stderr= 8925 1726776632.25731: handler run complete 8925 1726776632.25756: Evaluated conditional (False): False 8925 1726776632.25768: attempt loop complete, returning result 8925 1726776632.25773: _execute() done 8925 1726776632.25776: dumping result to json 8925 1726776632.25782: done dumping result, returning 8925 1726776632.25790: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly [120fa90a-8a95-c4e4-06a7-0000000000f5] 8925 1726776632.25796: sending task result for task 120fa90a-8a95-c4e4-06a7-0000000000f5 8925 1726776632.25837: done sending task result for task 120fa90a-8a95-c4e4-06a7-0000000000f5 8925 1726776632.25841: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "tuned-adm", "verify", "-i" ], "delta": "0:00:00.119146", "end": "2024-09-19 16:10:32.176213", "rc": 0, "start": "2024-09-19 16:10:32.057067" } STDOUT: Verification succeeded, current system settings match the preset profile. 
See TuneD log file ('/var/log/tuned/tuned.log') for details. 8283 1726776632.26677: no more pending results, returning what we have 8283 1726776632.26679: results queue empty 8283 1726776632.26680: checking for any_errors_fatal 8283 1726776632.26682: done checking for any_errors_fatal 8283 1726776632.26682: checking for max_fail_percentage 8283 1726776632.26683: done checking for max_fail_percentage 8283 1726776632.26684: checking to see if all hosts have failed and the running result is not ok 8283 1726776632.26685: done checking to see if all hosts have failed 8283 1726776632.26685: getting the remaining hosts for this loop 8283 1726776632.26686: done getting the remaining hosts for this loop 8283 1726776632.26689: getting the next task for host managed_node3 8283 1726776632.26694: done getting next task for host managed_node3 8283 1726776632.26698: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 8283 1726776632.26701: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8283 1726776632.26710: getting variables 8283 1726776632.26711: in VariableManager get_vars() 8283 1726776632.26738: Calling all_inventory to load vars for managed_node3 8283 1726776632.26741: Calling groups_inventory to load vars for managed_node3 8283 1726776632.26743: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776632.26750: Calling all_plugins_play to load vars for managed_node3 8283 1726776632.26753: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776632.26756: Calling groups_plugins_play to load vars for managed_node3 8283 1726776632.26803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776632.26847: done with get_vars() 8283 1726776632.26855: done getting variables 8283 1726776632.26944: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Get last verify results from log] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:12 Thursday 19 September 2024 16:10:32 -0400 (0:00:00.581) 0:00:15.974 **** 8283 1726776632.26977: entering _queue_task() for managed_node3/shell 8283 1726776632.26979: Creating lock for shell 8283 1726776632.27166: worker is 1 (out of 1 available) 8283 1726776632.27178: exiting _queue_task() for managed_node3/shell 8283 1726776632.27189: done queuing things up, now waiting for results queue to drain 8283 1726776632.27191: waiting for pending results... 
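The verification that just completed amounts to running tuned-adm verify -i on the managed host and, only if that fails, pulling the last verification details out of /var/log/tuned/tuned.log (the "Get last verify results from log" task queued above is skipped in the next block because the verify command returned rc=0). A hedged sketch of that pair of steps; the variable name and the log-extraction command are assumptions rather than the role's actual code:

  - name: Check that settings are applied correctly
    command: tuned-adm verify -i
    register: __verify_result            # assumed name; the role itself reads a set_fact variable
    changed_when: false
    ignore_errors: true

  - name: Get last verify results from log
    shell: tail -n 50 /var/log/tuned/tuned.log   # assumed extraction; the log path is quoted in the tuned-adm output above
    when: __verify_result is failed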
8971 1726776632.27495: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 8971 1726776632.27636: in run() - task 120fa90a-8a95-c4e4-06a7-0000000000f6 8971 1726776632.27653: variable 'ansible_search_path' from source: unknown 8971 1726776632.27658: variable 'ansible_search_path' from source: unknown 8971 1726776632.27690: calling self._execute() 8971 1726776632.27752: variable 'ansible_host' from source: host vars for 'managed_node3' 8971 1726776632.27762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8971 1726776632.27771: variable 'omit' from source: magic vars 8971 1726776632.28199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8971 1726776632.28468: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8971 1726776632.28513: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8971 1726776632.28546: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8971 1726776632.28577: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8971 1726776632.28675: variable '__kernel_settings_register_verify_values' from source: set_fact 8971 1726776632.28701: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 8971 1726776632.28706: when evaluation is False, skipping this task 8971 1726776632.28710: _execute() done 8971 1726776632.28713: dumping result to json 8971 1726776632.28717: done dumping result, returning 8971 1726776632.28722: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log [120fa90a-8a95-c4e4-06a7-0000000000f6] 8971 1726776632.28730: sending task result for task 120fa90a-8a95-c4e4-06a7-0000000000f6 8971 1726776632.28756: done sending task result for task 120fa90a-8a95-c4e4-06a7-0000000000f6 8971 1726776632.28759: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 8283 1726776632.29073: no more pending results, returning what we have 8283 1726776632.29076: results queue empty 8283 1726776632.29077: checking for any_errors_fatal 8283 1726776632.29087: done checking for any_errors_fatal 8283 1726776632.29088: checking for max_fail_percentage 8283 1726776632.29089: done checking for max_fail_percentage 8283 1726776632.29089: checking to see if all hosts have failed and the running result is not ok 8283 1726776632.29090: done checking to see if all hosts have failed 8283 1726776632.29090: getting the remaining hosts for this loop 8283 1726776632.29091: done getting the remaining hosts for this loop 8283 1726776632.29094: getting the next task for host managed_node3 8283 1726776632.29100: done getting next task for host managed_node3 8283 1726776632.29103: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 8283 1726776632.29106: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776632.29117: getting variables 8283 1726776632.29118: in VariableManager get_vars() 8283 1726776632.29149: Calling all_inventory to load vars for managed_node3 8283 1726776632.29151: Calling groups_inventory to load vars for managed_node3 8283 1726776632.29153: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776632.29161: Calling all_plugins_play to load vars for managed_node3 8283 1726776632.29164: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776632.29166: Calling groups_plugins_play to load vars for managed_node3 8283 1726776632.29213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776632.29256: done with get_vars() 8283 1726776632.29264: done getting variables 8283 1726776632.29318: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:23 Thursday 19 September 2024 16:10:32 -0400 (0:00:00.023) 0:00:15.998 **** 8283 1726776632.29354: entering _queue_task() for managed_node3/fail 8283 1726776632.29521: worker is 1 (out of 1 available) 8283 1726776632.29536: exiting _queue_task() for managed_node3/fail 8283 1726776632.29546: done queuing things up, now waiting for results queue to drain 8283 1726776632.29548: waiting for pending results... 
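The "Report errors that are not bootloader errors" task queued above uses the fail action behind the same __kernel_settings_register_verify_values is failed guard, so it only fires when verification actually failed. A minimal sketch of that pattern; the message text is an assumption:

  - name: Report errors that are not bootloader errors
    fail:
      msg: tuned-adm verify reported settings that were not applied correctly   # assumed wording
    when: __kernel_settings_register_verify_values is failed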
8972 1726776632.30025: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 8972 1726776632.30168: in run() - task 120fa90a-8a95-c4e4-06a7-0000000000f7 8972 1726776632.30186: variable 'ansible_search_path' from source: unknown 8972 1726776632.30191: variable 'ansible_search_path' from source: unknown 8972 1726776632.30220: calling self._execute() 8972 1726776632.30282: variable 'ansible_host' from source: host vars for 'managed_node3' 8972 1726776632.30292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8972 1726776632.30301: variable 'omit' from source: magic vars 8972 1726776632.30718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8972 1726776632.30991: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8972 1726776632.31033: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8972 1726776632.31110: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8972 1726776632.31143: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8972 1726776632.31238: variable '__kernel_settings_register_verify_values' from source: set_fact 8972 1726776632.31261: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 8972 1726776632.31266: when evaluation is False, skipping this task 8972 1726776632.31269: _execute() done 8972 1726776632.31272: dumping result to json 8972 1726776632.31276: done dumping result, returning 8972 1726776632.31283: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors [120fa90a-8a95-c4e4-06a7-0000000000f7] 8972 1726776632.31290: sending task result for task 120fa90a-8a95-c4e4-06a7-0000000000f7 8972 1726776632.31315: done sending task result for task 120fa90a-8a95-c4e4-06a7-0000000000f7 8972 1726776632.31318: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 8283 1726776632.31604: no more pending results, returning what we have 8283 1726776632.31606: results queue empty 8283 1726776632.31607: checking for any_errors_fatal 8283 1726776632.31611: done checking for any_errors_fatal 8283 1726776632.31612: checking for max_fail_percentage 8283 1726776632.31613: done checking for max_fail_percentage 8283 1726776632.31613: checking to see if all hosts have failed and the running result is not ok 8283 1726776632.31614: done checking to see if all hosts have failed 8283 1726776632.31614: getting the remaining hosts for this loop 8283 1726776632.31616: done getting the remaining hosts for this loop 8283 1726776632.31619: getting the next task for host managed_node3 8283 1726776632.31625: done getting next task for host managed_node3 8283 1726776632.31631: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 8283 1726776632.31633: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776632.31645: getting variables 8283 1726776632.31646: in VariableManager get_vars() 8283 1726776632.31675: Calling all_inventory to load vars for managed_node3 8283 1726776632.31678: Calling groups_inventory to load vars for managed_node3 8283 1726776632.31683: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776632.31691: Calling all_plugins_play to load vars for managed_node3 8283 1726776632.31694: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776632.31701: Calling groups_plugins_play to load vars for managed_node3 8283 1726776632.31752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776632.31796: done with get_vars() 8283 1726776632.31804: done getting variables 8283 1726776632.31857: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:177 Thursday 19 September 2024 16:10:32 -0400 (0:00:00.025) 0:00:16.023 **** 8283 1726776632.31888: entering _queue_task() for managed_node3/set_fact 8283 1726776632.32064: worker is 1 (out of 1 available) 8283 1726776632.32078: exiting _queue_task() for managed_node3/set_fact 8283 1726776632.32091: done queuing things up, now waiting for results queue to drain 8283 1726776632.32092: waiting for pending results... 
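The next two steps are set_fact bookkeeping; the results a few lines below show kernel_settings_reboot_required being set to false and __kernel_settings_changed to true. A sketch of the first of these, assuming a literal set_fact (the real task at main.yml:177 may derive the value from registered results):

  - name: Set the flag that reboot is needed to apply changes
    set_fact:
      kernel_settings_reboot_required: false   # value as reported in the task result below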
8973 1726776632.32370: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 8973 1726776632.32483: in run() - task 120fa90a-8a95-c4e4-06a7-000000000039 8973 1726776632.32498: variable 'ansible_search_path' from source: unknown 8973 1726776632.32502: variable 'ansible_search_path' from source: unknown 8973 1726776632.32533: calling self._execute() 8973 1726776632.32594: variable 'ansible_host' from source: host vars for 'managed_node3' 8973 1726776632.32603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8973 1726776632.32611: variable 'omit' from source: magic vars 8973 1726776632.32705: variable 'omit' from source: magic vars 8973 1726776632.32746: variable 'omit' from source: magic vars 8973 1726776632.32773: variable 'omit' from source: magic vars 8973 1726776632.32814: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8973 1726776632.32849: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8973 1726776632.32869: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8973 1726776632.32890: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8973 1726776632.32903: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8973 1726776632.32934: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8973 1726776632.32941: variable 'ansible_host' from source: host vars for 'managed_node3' 8973 1726776632.32945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8973 1726776632.33039: Set connection var ansible_module_compression to ZIP_DEFLATED 8973 1726776632.33047: Set connection var ansible_shell_type to sh 8973 1726776632.33053: Set connection var ansible_timeout to 10 8973 1726776632.33057: Set connection var ansible_connection to ssh 8973 1726776632.33063: Set connection var ansible_pipelining to False 8973 1726776632.33068: Set connection var ansible_shell_executable to /bin/sh 8973 1726776632.33087: variable 'ansible_shell_executable' from source: unknown 8973 1726776632.33090: variable 'ansible_connection' from source: unknown 8973 1726776632.33094: variable 'ansible_module_compression' from source: unknown 8973 1726776632.33096: variable 'ansible_shell_type' from source: unknown 8973 1726776632.33099: variable 'ansible_shell_executable' from source: unknown 8973 1726776632.33101: variable 'ansible_host' from source: host vars for 'managed_node3' 8973 1726776632.33105: variable 'ansible_pipelining' from source: unknown 8973 1726776632.33108: variable 'ansible_timeout' from source: unknown 8973 1726776632.33112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8973 1726776632.33644: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8973 1726776632.33657: variable 'omit' from source: magic vars 8973 1726776632.33663: starting attempt loop 8973 1726776632.33665: running the handler 8973 1726776632.33674: handler run complete 8973 1726776632.33684: 
attempt loop complete, returning result 8973 1726776632.33687: _execute() done 8973 1726776632.33690: dumping result to json 8973 1726776632.33693: done dumping result, returning 8973 1726776632.33699: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes [120fa90a-8a95-c4e4-06a7-000000000039] 8973 1726776632.33705: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000039 8973 1726776632.33732: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000039 8973 1726776632.33737: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "kernel_settings_reboot_required": false }, "changed": false } 8283 1726776632.34090: no more pending results, returning what we have 8283 1726776632.34093: results queue empty 8283 1726776632.34093: checking for any_errors_fatal 8283 1726776632.34099: done checking for any_errors_fatal 8283 1726776632.34099: checking for max_fail_percentage 8283 1726776632.34100: done checking for max_fail_percentage 8283 1726776632.34101: checking to see if all hosts have failed and the running result is not ok 8283 1726776632.34102: done checking to see if all hosts have failed 8283 1726776632.34102: getting the remaining hosts for this loop 8283 1726776632.34103: done getting the remaining hosts for this loop 8283 1726776632.34106: getting the next task for host managed_node3 8283 1726776632.34112: done getting next task for host managed_node3 8283 1726776632.34115: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8283 1726776632.34118: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8283 1726776632.34127: getting variables 8283 1726776632.34130: in VariableManager get_vars() 8283 1726776632.34161: Calling all_inventory to load vars for managed_node3 8283 1726776632.34164: Calling groups_inventory to load vars for managed_node3 8283 1726776632.34166: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776632.34174: Calling all_plugins_play to load vars for managed_node3 8283 1726776632.34177: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776632.34182: Calling groups_plugins_play to load vars for managed_node3 8283 1726776632.34223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776632.34267: done with get_vars() 8283 1726776632.34276: done getting variables 8283 1726776632.34334: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:181 Thursday 19 September 2024 16:10:32 -0400 (0:00:00.024) 0:00:16.048 **** 8283 1726776632.34365: entering _queue_task() for managed_node3/set_fact 8283 1726776632.34532: worker is 1 (out of 1 available) 8283 1726776632.34550: exiting _queue_task() for managed_node3/set_fact 8283 1726776632.34563: done queuing things up, now waiting for results queue to drain 8283 1726776632.34565: waiting for pending results... 8978 1726776632.34775: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8978 1726776632.34902: in run() - task 120fa90a-8a95-c4e4-06a7-00000000003a 8978 1726776632.34919: variable 'ansible_search_path' from source: unknown 8978 1726776632.34923: variable 'ansible_search_path' from source: unknown 8978 1726776632.34957: calling self._execute() 8978 1726776632.35022: variable 'ansible_host' from source: host vars for 'managed_node3' 8978 1726776632.35034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8978 1726776632.35044: variable 'omit' from source: magic vars 8978 1726776632.35138: variable 'omit' from source: magic vars 8978 1726776632.35185: variable 'omit' from source: magic vars 8978 1726776632.35506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8978 1726776632.35715: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8978 1726776632.35760: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8978 1726776632.35791: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8978 1726776632.35823: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8978 1726776632.35952: variable '__kernel_settings_register_profile' from source: set_fact 8978 1726776632.35967: variable '__kernel_settings_register_mode' from source: set_fact 8978 1726776632.35974: variable '__kernel_settings_register_apply' from source: set_fact 8978 1726776632.36020: variable 'omit' from source: magic vars 8978 1726776632.36049: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8978 1726776632.36076: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8978 1726776632.36092: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8978 1726776632.36110: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8978 1726776632.36120: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8978 1726776632.36150: variable 'inventory_hostname' from source: host vars for 'managed_node3' 8978 1726776632.36156: variable 'ansible_host' from source: host vars for 'managed_node3' 8978 1726776632.36160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8978 1726776632.36258: Set connection var ansible_module_compression to ZIP_DEFLATED 8978 1726776632.36268: Set connection var ansible_shell_type to sh 8978 1726776632.36274: Set connection var ansible_timeout to 10 8978 1726776632.36280: Set connection var ansible_connection to ssh 8978 1726776632.36288: Set connection var ansible_pipelining to False 8978 1726776632.36293: Set connection var ansible_shell_executable to /bin/sh 8978 1726776632.36311: variable 'ansible_shell_executable' from source: unknown 8978 1726776632.36316: variable 'ansible_connection' from source: unknown 8978 1726776632.36319: variable 'ansible_module_compression' from source: unknown 8978 1726776632.36322: variable 'ansible_shell_type' from source: unknown 8978 1726776632.36325: variable 'ansible_shell_executable' from source: unknown 8978 1726776632.36328: variable 'ansible_host' from source: host vars for 'managed_node3' 8978 1726776632.36334: variable 'ansible_pipelining' from source: unknown 8978 1726776632.36336: variable 'ansible_timeout' from source: unknown 8978 1726776632.36339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8978 1726776632.36425: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8978 1726776632.36438: variable 'omit' from source: magic vars 8978 1726776632.36444: starting attempt loop 8978 1726776632.36447: running the handler 8978 1726776632.36456: handler run complete 8978 1726776632.36465: attempt loop complete, returning result 8978 1726776632.36468: _execute() done 8978 1726776632.36471: dumping result to json 8978 1726776632.36474: done dumping result, returning 8978 1726776632.36479: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing [120fa90a-8a95-c4e4-06a7-00000000003a] 8978 1726776632.36486: sending task result for task 120fa90a-8a95-c4e4-06a7-00000000003a 8978 1726776632.36507: done sending task result for task 120fa90a-8a95-c4e4-06a7-00000000003a 8978 1726776632.36510: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_changed": true }, "changed": false } 8283 1726776632.36659: no more pending results, returning what we have 8283 1726776632.36662: results queue empty 8283 1726776632.36662: checking for any_errors_fatal 8283 1726776632.36667: done 
checking for any_errors_fatal 8283 1726776632.36667: checking for max_fail_percentage 8283 1726776632.36669: done checking for max_fail_percentage 8283 1726776632.36669: checking to see if all hosts have failed and the running result is not ok 8283 1726776632.36670: done checking to see if all hosts have failed 8283 1726776632.36670: getting the remaining hosts for this loop 8283 1726776632.36671: done getting the remaining hosts for this loop 8283 1726776632.36674: getting the next task for host managed_node3 8283 1726776632.36684: done getting next task for host managed_node3 8283 1726776632.36685: ^ task is: TASK: meta (role_complete) 8283 1726776632.36687: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776632.36697: getting variables 8283 1726776632.36698: in VariableManager get_vars() 8283 1726776632.36727: Calling all_inventory to load vars for managed_node3 8283 1726776632.36735: Calling groups_inventory to load vars for managed_node3 8283 1726776632.36737: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776632.36745: Calling all_plugins_play to load vars for managed_node3 8283 1726776632.36748: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776632.36751: Calling groups_plugins_play to load vars for managed_node3 8283 1726776632.36801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776632.36843: done with get_vars() 8283 1726776632.36850: done getting variables 8283 1726776632.36922: done queuing things up, now waiting for results queue to drain 8283 1726776632.36924: results queue empty 8283 1726776632.36924: checking for any_errors_fatal 8283 1726776632.36928: done checking for any_errors_fatal 8283 1726776632.36930: checking for max_fail_percentage 8283 1726776632.36931: done checking for max_fail_percentage 8283 1726776632.36936: checking to see if all hosts have failed and the running result is not ok 8283 1726776632.36937: done checking to see if all hosts have failed 8283 1726776632.36938: getting the remaining hosts for this loop 8283 1726776632.36938: done getting the remaining hosts for this loop 8283 1726776632.36941: getting the next task for host managed_node3 8283 1726776632.36944: done getting next task for host managed_node3 8283 1726776632.36946: ^ task is: TASK: Cleanup 8283 1726776632.36947: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8283 1726776632.36949: getting variables 8283 1726776632.36950: in VariableManager get_vars() 8283 1726776632.36964: Calling all_inventory to load vars for managed_node3 8283 1726776632.36966: Calling groups_inventory to load vars for managed_node3 8283 1726776632.36968: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776632.36972: Calling all_plugins_play to load vars for managed_node3 8283 1726776632.36974: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776632.36976: Calling groups_plugins_play to load vars for managed_node3 8283 1726776632.37009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776632.37035: done with get_vars() 8283 1726776632.37041: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_default.yml:14 Thursday 19 September 2024 16:10:32 -0400 (0:00:00.027) 0:00:16.076 **** 8283 1726776632.37134: entering _queue_task() for managed_node3/include_tasks 8283 1726776632.37268: worker is 1 (out of 1 available) 8283 1726776632.37286: exiting _queue_task() for managed_node3/include_tasks 8283 1726776632.37295: done queuing things up, now waiting for results queue to drain 8283 1726776632.37296: waiting for pending results... 8996 1726776632.37418: running TaskExecutor() for managed_node3/TASK: Cleanup 8996 1726776632.37518: in run() - task 120fa90a-8a95-c4e4-06a7-000000000007 8996 1726776632.37535: variable 'ansible_search_path' from source: unknown 8996 1726776632.37569: calling self._execute() 8996 1726776632.37640: variable 'ansible_host' from source: host vars for 'managed_node3' 8996 1726776632.37651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 8996 1726776632.37660: variable 'omit' from source: magic vars 8996 1726776632.37754: _execute() done 8996 1726776632.37761: dumping result to json 8996 1726776632.37765: done dumping result, returning 8996 1726776632.37770: done running TaskExecutor() for managed_node3/TASK: Cleanup [120fa90a-8a95-c4e4-06a7-000000000007] 8996 1726776632.37778: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000007 8996 1726776632.37803: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000007 8996 1726776632.37805: WORKER PROCESS EXITING 8283 1726776632.38046: no more pending results, returning what we have 8283 1726776632.38052: in VariableManager get_vars() 8283 1726776632.38087: Calling all_inventory to load vars for managed_node3 8283 1726776632.38090: Calling groups_inventory to load vars for managed_node3 8283 1726776632.38092: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776632.38100: Calling all_plugins_play to load vars for managed_node3 8283 1726776632.38103: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776632.38106: Calling groups_plugins_play to load vars for managed_node3 8283 1726776632.38153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776632.38181: done with get_vars() 8283 1726776632.38185: variable 'ansible_search_path' from source: unknown 8283 1726776632.38197: we have included files to process 8283 1726776632.38198: generating all_blocks data 8283 1726776632.38199: done generating all_blocks data 8283 1726776632.38203: processing included file: 
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml 8283 1726776632.38204: loading included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml 8283 1726776632.38206: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml included: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml for managed_node3 8283 1726776632.39118: done processing included file 8283 1726776632.39120: iterating over new_blocks loaded from include file 8283 1726776632.39121: in VariableManager get_vars() 8283 1726776632.39141: done with get_vars() 8283 1726776632.39143: filtering new block on tags 8283 1726776632.39157: done filtering new block on tags 8283 1726776632.39162: in VariableManager get_vars() 8283 1726776632.39195: done with get_vars() 8283 1726776632.39197: filtering new block on tags 8283 1726776632.39212: done filtering new block on tags 8283 1726776632.39213: done iterating over new_blocks loaded from include file 8283 1726776632.39214: extending task lists for all hosts with included blocks 8283 1726776632.39958: done extending task lists 8283 1726776632.39960: done processing included files 8283 1726776632.39961: results queue empty 8283 1726776632.39961: checking for any_errors_fatal 8283 1726776632.39963: done checking for any_errors_fatal 8283 1726776632.39963: checking for max_fail_percentage 8283 1726776632.39964: done checking for max_fail_percentage 8283 1726776632.39965: checking to see if all hosts have failed and the running result is not ok 8283 1726776632.39965: done checking to see if all hosts have failed 8283 1726776632.39966: getting the remaining hosts for this loop 8283 1726776632.39967: done getting the remaining hosts for this loop 8283 1726776632.39969: getting the next task for host managed_node3 8283 1726776632.39974: done getting next task for host managed_node3 8283 1726776632.39976: ^ task is: TASK: Show current tuned profile settings 8283 1726776632.39978: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8283 1726776632.39980: getting variables 8283 1726776632.39980: in VariableManager get_vars() 8283 1726776632.39993: Calling all_inventory to load vars for managed_node3 8283 1726776632.39995: Calling groups_inventory to load vars for managed_node3 8283 1726776632.39997: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776632.40002: Calling all_plugins_play to load vars for managed_node3 8283 1726776632.40005: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776632.40007: Calling groups_plugins_play to load vars for managed_node3 8283 1726776632.40042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776632.40070: done with get_vars() 8283 1726776632.40077: done getting variables 8283 1726776632.40109: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current tuned profile settings] ************************************* task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:2 Thursday 19 September 2024 16:10:32 -0400 (0:00:00.030) 0:00:16.106 **** 8283 1726776632.40138: entering _queue_task() for managed_node3/command 8283 1726776632.40342: worker is 1 (out of 1 available) 8283 1726776632.40356: exiting _queue_task() for managed_node3/command 8283 1726776632.40368: done queuing things up, now waiting for results queue to drain 8283 1726776632.40370: waiting for pending results... 9000 1726776632.40559: running TaskExecutor() for managed_node3/TASK: Show current tuned profile settings 9000 1726776632.40656: in run() - task 120fa90a-8a95-c4e4-06a7-000000000151 9000 1726776632.40670: variable 'ansible_search_path' from source: unknown 9000 1726776632.40674: variable 'ansible_search_path' from source: unknown 9000 1726776632.40701: calling self._execute() 9000 1726776632.40752: variable 'ansible_host' from source: host vars for 'managed_node3' 9000 1726776632.40762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9000 1726776632.40770: variable 'omit' from source: magic vars 9000 1726776632.40841: variable 'omit' from source: magic vars 9000 1726776632.40869: variable 'omit' from source: magic vars 9000 1726776632.41096: variable '__kernel_settings_profile_filename' from source: role '' exported vars 9000 1726776632.41151: variable '__kernel_settings_profile_dir' from source: role '' exported vars 9000 1726776632.41245: variable '__kernel_settings_profile_parent' from source: set_fact 9000 1726776632.41253: variable '__kernel_settings_tuned_profile' from source: role '' exported vars 9000 1726776632.41286: variable 'omit' from source: magic vars 9000 1726776632.41318: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9000 1726776632.41346: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9000 1726776632.41363: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9000 1726776632.41377: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9000 1726776632.41388: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9000 1726776632.41409: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9000 1726776632.41413: variable 'ansible_host' from source: host vars for 'managed_node3' 9000 1726776632.41415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9000 1726776632.41490: Set connection var ansible_module_compression to ZIP_DEFLATED 9000 1726776632.41498: Set connection var ansible_shell_type to sh 9000 1726776632.41504: Set connection var ansible_timeout to 10 9000 1726776632.41510: Set connection var ansible_connection to ssh 9000 1726776632.41517: Set connection var ansible_pipelining to False 9000 1726776632.41522: Set connection var ansible_shell_executable to /bin/sh 9000 1726776632.41538: variable 'ansible_shell_executable' from source: unknown 9000 1726776632.41542: variable 'ansible_connection' from source: unknown 9000 1726776632.41545: variable 'ansible_module_compression' from source: unknown 9000 1726776632.41549: variable 'ansible_shell_type' from source: unknown 9000 1726776632.41552: variable 'ansible_shell_executable' from source: unknown 9000 1726776632.41556: variable 'ansible_host' from source: host vars for 'managed_node3' 9000 1726776632.41560: variable 'ansible_pipelining' from source: unknown 9000 1726776632.41563: variable 'ansible_timeout' from source: unknown 9000 1726776632.41567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9000 1726776632.41661: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9000 1726776632.41672: variable 'omit' from source: magic vars 9000 1726776632.41678: starting attempt loop 9000 1726776632.41681: running the handler 9000 1726776632.41691: _low_level_execute_command(): starting 9000 1726776632.41696: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9000 1726776632.44460: stdout chunk (state=2): >>>/root <<< 9000 1726776632.44469: stderr chunk (state=2): >>><<< 9000 1726776632.44482: stdout chunk (state=3): >>><<< 9000 1726776632.44496: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9000 1726776632.44508: _low_level_execute_command(): starting 9000 1726776632.44513: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776632.4450336-9000-234818331104069 `" && echo ansible-tmp-1726776632.4450336-9000-234818331104069="` echo /root/.ansible/tmp/ansible-tmp-1726776632.4450336-9000-234818331104069 `" ) && sleep 0' 9000 1726776632.46903: stdout chunk (state=2): >>>ansible-tmp-1726776632.4450336-9000-234818331104069=/root/.ansible/tmp/ansible-tmp-1726776632.4450336-9000-234818331104069 <<< 9000 1726776632.47032: stderr chunk (state=3): >>><<< 9000 1726776632.47039: stdout chunk (state=3): >>><<< 9000 1726776632.47056: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776632.4450336-9000-234818331104069=/root/.ansible/tmp/ansible-tmp-1726776632.4450336-9000-234818331104069 , stderr= 9000 1726776632.47078: variable 'ansible_module_compression' from source: unknown 9000 1726776632.47120: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 9000 1726776632.47149: variable 'ansible_facts' from source: unknown 9000 1726776632.47222: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776632.4450336-9000-234818331104069/AnsiballZ_command.py 9000 1726776632.47316: Sending initial data 9000 1726776632.47323: Sent initial data (154 bytes) 9000 1726776632.49768: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmp_dm0pb4n /root/.ansible/tmp/ansible-tmp-1726776632.4450336-9000-234818331104069/AnsiballZ_command.py <<< 9000 1726776632.50736: stderr chunk (state=3): >>><<< 9000 1726776632.50743: stdout chunk (state=3): >>><<< 9000 1726776632.50762: done transferring module to remote 9000 1726776632.50775: _low_level_execute_command(): starting 9000 1726776632.50782: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776632.4450336-9000-234818331104069/ /root/.ansible/tmp/ansible-tmp-1726776632.4450336-9000-234818331104069/AnsiballZ_command.py && sleep 0' 9000 1726776632.53123: stderr chunk (state=2): >>><<< 9000 1726776632.53135: stdout chunk (state=2): >>><<< 9000 1726776632.53150: _low_level_execute_command() done: rc=0, stdout=, stderr= 9000 1726776632.53155: _low_level_execute_command(): starting 9000 1726776632.53159: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776632.4450336-9000-234818331104069/AnsiballZ_command.py && sleep 0' 9000 1726776632.68384: stdout chunk (state=2): >>> {"changed": true, "stdout": "#\n# Ansible managed\n#\n# system_role:kernel_settings\n\n[main]\nsummary = kernel settings", "stderr": "", "rc": 0, "cmd": ["cat", "/etc/tuned/kernel_settings/tuned.conf"], "start": "2024-09-19 16:10:32.680137", "end": "2024-09-19 16:10:32.682994", "delta": "0:00:00.002857", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /etc/tuned/kernel_settings/tuned.conf", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 9000 1726776632.69547: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 9000 1726776632.69593: stderr chunk (state=3): >>><<< 9000 1726776632.69600: stdout chunk (state=3): >>><<< 9000 1726776632.69616: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "#\n# Ansible managed\n#\n# system_role:kernel_settings\n\n[main]\nsummary = kernel settings", "stderr": "", "rc": 0, "cmd": ["cat", "/etc/tuned/kernel_settings/tuned.conf"], "start": "2024-09-19 16:10:32.680137", "end": "2024-09-19 16:10:32.682994", "delta": "0:00:00.002857", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /etc/tuned/kernel_settings/tuned.conf", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.186 closed. 
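The module execution above is the "Show current tuned profile settings" step from tests/kernel_settings/tasks/cleanup.yml:2: it simply cats the generated tuned profile on the managed node. A minimal sketch of that task follows, assuming a plain command task; the changed_when: false guard is an inference from the "Evaluated conditional (False)" and "changed": false result later in this run, not something reproduced from the task file.

    # Hedged sketch of the task at tests/kernel_settings/tasks/cleanup.yml:2
    - name: Show current tuned profile settings
      command: cat /etc/tuned/kernel_settings/tuned.conf
      changed_when: false   # assumed: keeps the read-only "cat" from reporting a change
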
9000 1726776632.69652: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776632.4450336-9000-234818331104069/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9000 1726776632.69662: _low_level_execute_command(): starting 9000 1726776632.69668: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776632.4450336-9000-234818331104069/ > /dev/null 2>&1 && sleep 0' 9000 1726776632.72061: stderr chunk (state=2): >>><<< 9000 1726776632.72073: stdout chunk (state=2): >>><<< 9000 1726776632.72089: _low_level_execute_command() done: rc=0, stdout=, stderr= 9000 1726776632.72097: handler run complete 9000 1726776632.72122: Evaluated conditional (False): False 9000 1726776632.72137: attempt loop complete, returning result 9000 1726776632.72142: _execute() done 9000 1726776632.72145: dumping result to json 9000 1726776632.72151: done dumping result, returning 9000 1726776632.72157: done running TaskExecutor() for managed_node3/TASK: Show current tuned profile settings [120fa90a-8a95-c4e4-06a7-000000000151] 9000 1726776632.72164: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000151 9000 1726776632.72201: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000151 9000 1726776632.72205: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "cat", "/etc/tuned/kernel_settings/tuned.conf" ], "delta": "0:00:00.002857", "end": "2024-09-19 16:10:32.682994", "rc": 0, "start": "2024-09-19 16:10:32.680137" } STDOUT: # # Ansible managed # # system_role:kernel_settings [main] summary = kernel settings 8283 1726776632.72608: no more pending results, returning what we have 8283 1726776632.72611: results queue empty 8283 1726776632.72612: checking for any_errors_fatal 8283 1726776632.72614: done checking for any_errors_fatal 8283 1726776632.72614: checking for max_fail_percentage 8283 1726776632.72616: done checking for max_fail_percentage 8283 1726776632.72616: checking to see if all hosts have failed and the running result is not ok 8283 1726776632.72617: done checking to see if all hosts have failed 8283 1726776632.72617: getting the remaining hosts for this loop 8283 1726776632.72618: done getting the remaining hosts for this loop 8283 1726776632.72622: getting the next task for host managed_node3 8283 1726776632.72633: done getting next task for host managed_node3 8283 1726776632.72635: ^ task is: TASK: Run role with purge to remove everything 8283 1726776632.72638: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776632.72641: getting variables 8283 1726776632.72642: in VariableManager get_vars() 8283 1726776632.72675: Calling all_inventory to load vars for managed_node3 8283 1726776632.72678: Calling groups_inventory to load vars for managed_node3 8283 1726776632.72680: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776632.72689: Calling all_plugins_play to load vars for managed_node3 8283 1726776632.72692: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776632.72695: Calling groups_plugins_play to load vars for managed_node3 8283 1726776632.72746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776632.72786: done with get_vars() 8283 1726776632.72794: done getting variables TASK [Run role with purge to remove everything] ******************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:9 Thursday 19 September 2024 16:10:32 -0400 (0:00:00.327) 0:00:16.433 **** 8283 1726776632.72881: entering _queue_task() for managed_node3/include_role 8283 1726776632.73080: worker is 1 (out of 1 available) 8283 1726776632.73094: exiting _queue_task() for managed_node3/include_role 8283 1726776632.73105: done queuing things up, now waiting for results queue to drain 8283 1726776632.73107: waiting for pending results... 9015 1726776632.73312: running TaskExecutor() for managed_node3/TASK: Run role with purge to remove everything 9015 1726776632.73434: in run() - task 120fa90a-8a95-c4e4-06a7-000000000153 9015 1726776632.73451: variable 'ansible_search_path' from source: unknown 9015 1726776632.73455: variable 'ansible_search_path' from source: unknown 9015 1726776632.73486: calling self._execute() 9015 1726776632.73601: variable 'ansible_host' from source: host vars for 'managed_node3' 9015 1726776632.73610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9015 1726776632.73620: variable 'omit' from source: magic vars 9015 1726776632.73709: _execute() done 9015 1726776632.73715: dumping result to json 9015 1726776632.73720: done dumping result, returning 9015 1726776632.73724: done running TaskExecutor() for managed_node3/TASK: Run role with purge to remove everything [120fa90a-8a95-c4e4-06a7-000000000153] 9015 1726776632.73734: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000153 9015 1726776632.73766: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000153 9015 1726776632.73770: WORKER PROCESS EXITING 8283 1726776632.74064: no more pending results, returning what we have 8283 1726776632.74069: in VariableManager get_vars() 8283 1726776632.74145: Calling all_inventory to load vars for managed_node3 8283 1726776632.74148: Calling groups_inventory to load vars for managed_node3 8283 1726776632.74150: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776632.74158: Calling all_plugins_play to load vars for managed_node3 8283 1726776632.74160: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776632.74162: Calling groups_plugins_play to load vars for managed_node3 8283 1726776632.74205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776632.74234: done with get_vars() 8283 1726776632.74238: variable 'ansible_search_path' from source: unknown 8283 
1726776632.74239: variable 'ansible_search_path' from source: unknown 8283 1726776632.74499: variable 'omit' from source: magic vars 8283 1726776632.74528: variable 'omit' from source: magic vars 8283 1726776632.74543: variable 'omit' from source: magic vars 8283 1726776632.74546: we have included files to process 8283 1726776632.74546: generating all_blocks data 8283 1726776632.74548: done generating all_blocks data 8283 1726776632.74552: processing included file: fedora.linux_system_roles.kernel_settings 8283 1726776632.74571: in VariableManager get_vars() 8283 1726776632.74584: done with get_vars() 8283 1726776632.74608: in VariableManager get_vars() 8283 1726776632.74623: done with get_vars() 8283 1726776632.74657: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml 8283 1726776632.74712: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml 8283 1726776632.74735: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml 8283 1726776632.74807: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml 8283 1726776632.75335: in VariableManager get_vars() 8283 1726776632.75356: done with get_vars() 8283 1726776632.76603: in VariableManager get_vars() 8283 1726776632.76625: done with get_vars() 8283 1726776632.76781: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml 8283 1726776632.77448: iterating over new_blocks loaded from include file 8283 1726776632.77450: in VariableManager get_vars() 8283 1726776632.77485: done with get_vars() 8283 1726776632.77487: filtering new block on tags 8283 1726776632.77506: done filtering new block on tags 8283 1726776632.77508: in VariableManager get_vars() 8283 1726776632.77523: done with get_vars() 8283 1726776632.77525: filtering new block on tags 8283 1726776632.77546: done filtering new block on tags 8283 1726776632.77548: in VariableManager get_vars() 8283 1726776632.77563: done with get_vars() 8283 1726776632.77565: filtering new block on tags 8283 1726776632.77610: done filtering new block on tags 8283 1726776632.77612: in VariableManager get_vars() 8283 1726776632.77630: done with get_vars() 8283 1726776632.77631: filtering new block on tags 8283 1726776632.77647: done filtering new block on tags 8283 1726776632.77649: done iterating over new_blocks loaded from include file 8283 1726776632.77650: extending task lists for all hosts with included blocks 8283 1726776632.77934: done extending task lists 8283 1726776632.77935: done processing included files 8283 1726776632.77936: results queue empty 8283 1726776632.77936: checking for any_errors_fatal 8283 1726776632.77941: done checking for any_errors_fatal 8283 1726776632.77941: checking for max_fail_percentage 8283 1726776632.77942: done checking for max_fail_percentage 8283 1726776632.77943: checking to see if all hosts have failed and the running result is not ok 8283 1726776632.77944: done checking to see if all hosts have failed 8283 1726776632.77944: getting the remaining hosts for this loop 8283 1726776632.77945: done getting the remaining hosts for this loop 8283 1726776632.77947: getting the next task for host managed_node3 8283 1726776632.77952: done getting next task for host managed_node3 8283 1726776632.77954: ^ task is: TASK: 
fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8283 1726776632.77957: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776632.77966: getting variables 8283 1726776632.77967: in VariableManager get_vars() 8283 1726776632.77979: Calling all_inventory to load vars for managed_node3 8283 1726776632.77982: Calling groups_inventory to load vars for managed_node3 8283 1726776632.77984: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776632.77989: Calling all_plugins_play to load vars for managed_node3 8283 1726776632.77991: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776632.77994: Calling groups_plugins_play to load vars for managed_node3 8283 1726776632.78028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776632.78067: done with get_vars() 8283 1726776632.78073: done getting variables 8283 1726776632.78111: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2 Thursday 19 September 2024 16:10:32 -0400 (0:00:00.052) 0:00:16.486 **** 8283 1726776632.78151: entering _queue_task() for managed_node3/fail 8283 1726776632.78382: worker is 1 (out of 1 available) 8283 1726776632.78394: exiting _queue_task() for managed_node3/fail 8283 1726776632.78406: done queuing things up, now waiting for results queue to drain 8283 1726776632.78408: waiting for pending results... 
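The include that triggered this role re-run is the "Run role with purge to remove everything" task at tests/kernel_settings/tasks/cleanup.yml:9. A minimal sketch follows, assuming the purge is requested via the role's kernel_settings_purge variable; that variable name is an assumption based on the task name and the role's documented purge option, since the task file itself is not reproduced in this log.

    # Hedged sketch of the task at tests/kernel_settings/tasks/cleanup.yml:9
    - name: Run role with purge to remove everything
      include_role:
        name: fedora.linux_system_roles.kernel_settings
      vars:
        kernel_settings_purge: true   # assumed variable; not visible in this log
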
9016 1726776632.78620: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 9016 1726776632.78761: in run() - task 120fa90a-8a95-c4e4-06a7-0000000001f5 9016 1726776632.78778: variable 'ansible_search_path' from source: unknown 9016 1726776632.78786: variable 'ansible_search_path' from source: unknown 9016 1726776632.78821: calling self._execute() 9016 1726776632.78890: variable 'ansible_host' from source: host vars for 'managed_node3' 9016 1726776632.78901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9016 1726776632.78911: variable 'omit' from source: magic vars 9016 1726776632.79373: variable 'kernel_settings_sysctl' from source: include params 9016 1726776632.79387: variable '__kernel_settings_state_empty' from source: role '' all vars 9016 1726776632.79401: Evaluated conditional (kernel_settings_sysctl != __kernel_settings_state_empty): True 9016 1726776632.79635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9016 1726776632.81159: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9016 1726776632.81208: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9016 1726776632.81240: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9016 1726776632.81275: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9016 1726776632.81298: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9016 1726776632.81356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9016 1726776632.81388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9016 1726776632.81409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9016 1726776632.81452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9016 1726776632.81466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9016 1726776632.81511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9016 1726776632.81532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9016 1726776632.81555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 9016 1726776632.81578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9016 1726776632.81595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9016 1726776632.81636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9016 1726776632.81657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9016 1726776632.81678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9016 1726776632.81718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9016 1726776632.81734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9016 1726776632.82002: variable 'kernel_settings_sysctl' from source: include params 9016 1726776632.82031: Evaluated conditional ((kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", true) | list | length > 0) or (kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", false) | list | length > 0)): False 9016 1726776632.82037: when evaluation is False, skipping this task 9016 1726776632.82040: _execute() done 9016 1726776632.82043: dumping result to json 9016 1726776632.82046: done dumping result, returning 9016 1726776632.82056: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [120fa90a-8a95-c4e4-06a7-0000000001f5] 9016 1726776632.82063: sending task result for task 120fa90a-8a95-c4e4-06a7-0000000001f5 9016 1726776632.82091: done sending task result for task 120fa90a-8a95-c4e4-06a7-0000000001f5 9016 1726776632.82094: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "(kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", true) | list | length > 0) or (kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", false) | list | length > 0)", "skip_reason": "Conditional result was False" } 8283 1726776632.82492: no more pending results, returning what we have 8283 1726776632.82495: results queue empty 8283 1726776632.82496: checking for any_errors_fatal 8283 1726776632.82497: done checking for any_errors_fatal 8283 1726776632.82498: checking for max_fail_percentage 8283 1726776632.82501: done checking for max_fail_percentage 8283 1726776632.82501: checking to see if all hosts have failed and the running result is not ok 8283 1726776632.82502: done checking to see if all hosts have failed 8283 1726776632.82502: getting the remaining hosts for this loop 8283 
1726776632.82504: done getting the remaining hosts for this loop 8283 1726776632.82507: getting the next task for host managed_node3 8283 1726776632.82513: done getting next task for host managed_node3 8283 1726776632.82517: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8283 1726776632.82521: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776632.82543: getting variables 8283 1726776632.82545: in VariableManager get_vars() 8283 1726776632.82570: Calling all_inventory to load vars for managed_node3 8283 1726776632.82572: Calling groups_inventory to load vars for managed_node3 8283 1726776632.82574: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776632.82580: Calling all_plugins_play to load vars for managed_node3 8283 1726776632.82583: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776632.82585: Calling groups_plugins_play to load vars for managed_node3 8283 1726776632.82632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776632.82685: done with get_vars() 8283 1726776632.82692: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Set version specific variables] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:9 Thursday 19 September 2024 16:10:32 -0400 (0:00:00.046) 0:00:16.532 **** 8283 1726776632.82783: entering _queue_task() for managed_node3/include_tasks 8283 1726776632.82986: worker is 1 (out of 1 available) 8283 1726776632.83003: exiting _queue_task() for managed_node3/include_tasks 8283 1726776632.83014: done queuing things up, now waiting for results queue to drain 8283 1726776632.83016: waiting for pending results... 
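The "Check sysctl settings for boolean values" task skipped just above is a fail task at roles/kernel_settings/tasks/main.yml:2, guarded by the two conditionals evaluated in the log. A minimal sketch follows; the when expressions are copied from the evaluated conditionals, while the msg wording is assumed.

    # Hedged sketch of the task at roles/kernel_settings/tasks/main.yml:2
    - name: Check sysctl settings for boolean values
      fail:
        msg: kernel_settings_sysctl must not contain boolean values   # assumed wording
      when:
        - kernel_settings_sysctl != __kernel_settings_state_empty
        - >-
          (kernel_settings_sysctl | selectattr("value", "defined") |
           selectattr("value", "sameas", true) | list | length > 0) or
          (kernel_settings_sysctl | selectattr("value", "defined") |
           selectattr("value", "sameas", false) | list | length > 0)
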
9019 1726776632.83225: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 9019 1726776632.83359: in run() - task 120fa90a-8a95-c4e4-06a7-0000000001f6 9019 1726776632.83377: variable 'ansible_search_path' from source: unknown 9019 1726776632.83384: variable 'ansible_search_path' from source: unknown 9019 1726776632.83414: calling self._execute() 9019 1726776632.83486: variable 'ansible_host' from source: host vars for 'managed_node3' 9019 1726776632.83496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9019 1726776632.83504: variable 'omit' from source: magic vars 9019 1726776632.83597: _execute() done 9019 1726776632.83602: dumping result to json 9019 1726776632.83605: done dumping result, returning 9019 1726776632.83609: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables [120fa90a-8a95-c4e4-06a7-0000000001f6] 9019 1726776632.83615: sending task result for task 120fa90a-8a95-c4e4-06a7-0000000001f6 9019 1726776632.83637: done sending task result for task 120fa90a-8a95-c4e4-06a7-0000000001f6 9019 1726776632.83640: WORKER PROCESS EXITING 8283 1726776632.83876: no more pending results, returning what we have 8283 1726776632.83879: in VariableManager get_vars() 8283 1726776632.83905: Calling all_inventory to load vars for managed_node3 8283 1726776632.83907: Calling groups_inventory to load vars for managed_node3 8283 1726776632.83908: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776632.83914: Calling all_plugins_play to load vars for managed_node3 8283 1726776632.83916: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776632.83917: Calling groups_plugins_play to load vars for managed_node3 8283 1726776632.83961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776632.83990: done with get_vars() 8283 1726776632.83994: variable 'ansible_search_path' from source: unknown 8283 1726776632.83995: variable 'ansible_search_path' from source: unknown 8283 1726776632.84016: we have included files to process 8283 1726776632.84017: generating all_blocks data 8283 1726776632.84018: done generating all_blocks data 8283 1726776632.84022: processing included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8283 1726776632.84022: loading included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8283 1726776632.84023: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml included: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml for managed_node3 8283 1726776632.84506: done processing included file 8283 1726776632.84508: iterating over new_blocks loaded from include file 8283 1726776632.84509: in VariableManager get_vars() 8283 1726776632.84523: done with get_vars() 8283 1726776632.84525: filtering new block on tags 8283 1726776632.84537: done filtering new block on tags 8283 1726776632.84538: in VariableManager get_vars() 8283 1726776632.84552: done with get_vars() 8283 1726776632.84553: filtering new block on tags 8283 1726776632.84564: done filtering new block on tags 8283 1726776632.84565: in VariableManager get_vars() 8283 1726776632.84578: done with get_vars() 
8283 1726776632.84579: filtering new block on tags 8283 1726776632.84591: done filtering new block on tags 8283 1726776632.84593: in VariableManager get_vars() 8283 1726776632.84607: done with get_vars() 8283 1726776632.84608: filtering new block on tags 8283 1726776632.84617: done filtering new block on tags 8283 1726776632.84618: done iterating over new_blocks loaded from include file 8283 1726776632.84618: extending task lists for all hosts with included blocks 8283 1726776632.84753: done extending task lists 8283 1726776632.84753: done processing included files 8283 1726776632.84754: results queue empty 8283 1726776632.84754: checking for any_errors_fatal 8283 1726776632.84757: done checking for any_errors_fatal 8283 1726776632.84757: checking for max_fail_percentage 8283 1726776632.84758: done checking for max_fail_percentage 8283 1726776632.84758: checking to see if all hosts have failed and the running result is not ok 8283 1726776632.84758: done checking to see if all hosts have failed 8283 1726776632.84759: getting the remaining hosts for this loop 8283 1726776632.84759: done getting the remaining hosts for this loop 8283 1726776632.84761: getting the next task for host managed_node3 8283 1726776632.84764: done getting next task for host managed_node3 8283 1726776632.84765: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8283 1726776632.84767: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8283 1726776632.84774: getting variables 8283 1726776632.84775: in VariableManager get_vars() 8283 1726776632.84785: Calling all_inventory to load vars for managed_node3 8283 1726776632.84786: Calling groups_inventory to load vars for managed_node3 8283 1726776632.84788: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776632.84791: Calling all_plugins_play to load vars for managed_node3 8283 1726776632.84792: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776632.84794: Calling groups_plugins_play to load vars for managed_node3 8283 1726776632.84815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776632.84841: done with get_vars() 8283 1726776632.84846: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2 Thursday 19 September 2024 16:10:32 -0400 (0:00:00.021) 0:00:16.553 **** 8283 1726776632.84899: entering _queue_task() for managed_node3/setup 8283 1726776632.85053: worker is 1 (out of 1 available) 8283 1726776632.85066: exiting _queue_task() for managed_node3/setup 8283 1726776632.85077: done queuing things up, now waiting for results queue to drain 8283 1726776632.85079: waiting for pending results... 9021 1726776632.85187: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 9021 1726776632.85298: in run() - task 120fa90a-8a95-c4e4-06a7-000000000271 9021 1726776632.85312: variable 'ansible_search_path' from source: unknown 9021 1726776632.85316: variable 'ansible_search_path' from source: unknown 9021 1726776632.85344: calling self._execute() 9021 1726776632.85390: variable 'ansible_host' from source: host vars for 'managed_node3' 9021 1726776632.85398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9021 1726776632.85407: variable 'omit' from source: magic vars 9021 1726776632.85797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9021 1726776632.87550: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9021 1726776632.87741: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9021 1726776632.87771: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9021 1726776632.87798: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9021 1726776632.87820: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9021 1726776632.87876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9021 1726776632.87899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9021 1726776632.87918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 9021 1726776632.87949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9021 1726776632.87962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9021 1726776632.88000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9021 1726776632.88017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9021 1726776632.88036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9021 1726776632.88063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9021 1726776632.88074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9021 1726776632.88192: variable '__kernel_settings_required_facts' from source: role '' all vars 9021 1726776632.88203: variable 'ansible_facts' from source: unknown 9021 1726776632.88233: Evaluated conditional (__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 9021 1726776632.88238: when evaluation is False, skipping this task 9021 1726776632.88241: _execute() done 9021 1726776632.88245: dumping result to json 9021 1726776632.88248: done dumping result, returning 9021 1726776632.88255: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role [120fa90a-8a95-c4e4-06a7-000000000271] 9021 1726776632.88261: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000271 9021 1726776632.88280: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000271 9021 1726776632.88283: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } 8283 1726776632.88427: no more pending results, returning what we have 8283 1726776632.88431: results queue empty 8283 1726776632.88432: checking for any_errors_fatal 8283 1726776632.88433: done checking for any_errors_fatal 8283 1726776632.88434: checking for max_fail_percentage 8283 1726776632.88435: done checking for max_fail_percentage 8283 1726776632.88436: checking to see if all hosts have failed and the running result is not ok 8283 1726776632.88436: done checking to see if all hosts have failed 8283 1726776632.88437: getting the remaining hosts for this loop 8283 1726776632.88438: done getting the remaining hosts for this loop 8283 1726776632.88441: getting the next task for host managed_node3 8283 1726776632.88450: done getting next task for host managed_node3 8283 1726776632.88453: ^ task is: TASK: 
fedora.linux_system_roles.kernel_settings : Check if system is ostree 8283 1726776632.88457: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776632.88470: getting variables 8283 1726776632.88471: in VariableManager get_vars() 8283 1726776632.88506: Calling all_inventory to load vars for managed_node3 8283 1726776632.88509: Calling groups_inventory to load vars for managed_node3 8283 1726776632.88511: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776632.88519: Calling all_plugins_play to load vars for managed_node3 8283 1726776632.88521: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776632.88524: Calling groups_plugins_play to load vars for managed_node3 8283 1726776632.88571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776632.88609: done with get_vars() 8283 1726776632.88615: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if system is ostree] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10 Thursday 19 September 2024 16:10:32 -0400 (0:00:00.037) 0:00:16.591 **** 8283 1726776632.88690: entering _queue_task() for managed_node3/stat 8283 1726776632.88855: worker is 1 (out of 1 available) 8283 1726776632.88869: exiting _queue_task() for managed_node3/stat 8283 1726776632.88883: done queuing things up, now waiting for results queue to drain 8283 1726776632.88885: waiting for pending results... 
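The "Ensure ansible_facts used by role" task skipped a few records above (roles/kernel_settings/tasks/set_vars.yml:2) is a conditional setup call that only gathers facts when something the role needs is missing. A minimal sketch follows; only the when expression is taken from this log, and any gather_subset argument the real task may pass is omitted because it is not visible in this run.

    # Hedged sketch of the task at roles/kernel_settings/tasks/set_vars.yml:2
    - name: Ensure ansible_facts used by role
      setup:   # the real task may pass a gather_subset argument not shown in this log
      when: __kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0
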
9023 1726776632.88999: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 9023 1726776632.89116: in run() - task 120fa90a-8a95-c4e4-06a7-000000000273 9023 1726776632.89133: variable 'ansible_search_path' from source: unknown 9023 1726776632.89137: variable 'ansible_search_path' from source: unknown 9023 1726776632.89164: calling self._execute() 9023 1726776632.89271: variable 'ansible_host' from source: host vars for 'managed_node3' 9023 1726776632.89280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9023 1726776632.89289: variable 'omit' from source: magic vars 9023 1726776632.89603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9023 1726776632.89772: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9023 1726776632.89806: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9023 1726776632.89838: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9023 1726776632.89863: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9023 1726776632.89923: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9023 1726776632.89944: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9023 1726776632.89963: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9023 1726776632.89983: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 9023 1726776632.90067: variable '__kernel_settings_is_ostree' from source: set_fact 9023 1726776632.90079: Evaluated conditional (not __kernel_settings_is_ostree is defined): False 9023 1726776632.90084: when evaluation is False, skipping this task 9023 1726776632.90087: _execute() done 9023 1726776632.90089: dumping result to json 9023 1726776632.90092: done dumping result, returning 9023 1726776632.90096: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree [120fa90a-8a95-c4e4-06a7-000000000273] 9023 1726776632.90101: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000273 9023 1726776632.90120: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000273 9023 1726776632.90122: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __kernel_settings_is_ostree is defined", "skip_reason": "Conditional result was False" } 8283 1726776632.90544: no more pending results, returning what we have 8283 1726776632.90547: results queue empty 8283 1726776632.90548: checking for any_errors_fatal 8283 1726776632.90552: done checking for any_errors_fatal 8283 1726776632.90552: checking for max_fail_percentage 8283 1726776632.90554: done checking for max_fail_percentage 8283 1726776632.90554: checking to see if all hosts have failed and the running result is not ok 
8283 1726776632.90555: done checking to see if all hosts have failed 8283 1726776632.90555: getting the remaining hosts for this loop 8283 1726776632.90557: done getting the remaining hosts for this loop 8283 1726776632.90561: getting the next task for host managed_node3 8283 1726776632.90567: done getting next task for host managed_node3 8283 1726776632.90570: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8283 1726776632.90574: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776632.90586: getting variables 8283 1726776632.90588: in VariableManager get_vars() 8283 1726776632.90614: Calling all_inventory to load vars for managed_node3 8283 1726776632.90617: Calling groups_inventory to load vars for managed_node3 8283 1726776632.90619: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776632.90627: Calling all_plugins_play to load vars for managed_node3 8283 1726776632.90631: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776632.90634: Calling groups_plugins_play to load vars for managed_node3 8283 1726776632.90682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776632.90725: done with get_vars() 8283 1726776632.90736: done getting variables 8283 1726776632.90793: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:15 Thursday 19 September 2024 16:10:32 -0400 (0:00:00.021) 0:00:16.613 **** 8283 1726776632.90830: entering _queue_task() for managed_node3/set_fact 8283 1726776632.90991: worker is 1 (out of 1 available) 8283 1726776632.91004: exiting _queue_task() for managed_node3/set_fact 8283 1726776632.91015: done queuing things up, now waiting for results queue to drain 8283 1726776632.91017: waiting for pending results... 
9029 1726776632.91165: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 9029 1726776632.91288: in run() - task 120fa90a-8a95-c4e4-06a7-000000000274 9029 1726776632.91304: variable 'ansible_search_path' from source: unknown 9029 1726776632.91308: variable 'ansible_search_path' from source: unknown 9029 1726776632.91335: calling self._execute() 9029 1726776632.91386: variable 'ansible_host' from source: host vars for 'managed_node3' 9029 1726776632.91395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9029 1726776632.91404: variable 'omit' from source: magic vars 9029 1726776632.91749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9029 1726776632.91920: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9029 1726776632.91955: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9029 1726776632.91978: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9029 1726776632.92006: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9029 1726776632.92068: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9029 1726776632.92090: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9029 1726776632.92109: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9029 1726776632.92127: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 9029 1726776632.92212: variable '__kernel_settings_is_ostree' from source: set_fact 9029 1726776632.92223: Evaluated conditional (not __kernel_settings_is_ostree is defined): False 9029 1726776632.92227: when evaluation is False, skipping this task 9029 1726776632.92233: _execute() done 9029 1726776632.92238: dumping result to json 9029 1726776632.92241: done dumping result, returning 9029 1726776632.92247: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree [120fa90a-8a95-c4e4-06a7-000000000274] 9029 1726776632.92254: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000274 9029 1726776632.92276: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000274 9029 1726776632.92280: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __kernel_settings_is_ostree is defined", "skip_reason": "Conditional result was False" } 8283 1726776632.92402: no more pending results, returning what we have 8283 1726776632.92405: results queue empty 8283 1726776632.92406: checking for any_errors_fatal 8283 1726776632.92410: done checking for any_errors_fatal 8283 1726776632.92410: checking for max_fail_percentage 8283 1726776632.92411: done checking for max_fail_percentage 8283 1726776632.92412: checking to see if all hosts have failed and the 
running result is not ok 8283 1726776632.92413: done checking to see if all hosts have failed 8283 1726776632.92413: getting the remaining hosts for this loop 8283 1726776632.92415: done getting the remaining hosts for this loop 8283 1726776632.92418: getting the next task for host managed_node3 8283 1726776632.92426: done getting next task for host managed_node3 8283 1726776632.92431: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 8283 1726776632.92434: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776632.92446: getting variables 8283 1726776632.92447: in VariableManager get_vars() 8283 1726776632.92479: Calling all_inventory to load vars for managed_node3 8283 1726776632.92482: Calling groups_inventory to load vars for managed_node3 8283 1726776632.92484: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776632.92493: Calling all_plugins_play to load vars for managed_node3 8283 1726776632.92496: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776632.92498: Calling groups_plugins_play to load vars for managed_node3 8283 1726776632.92550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776632.92601: done with get_vars() 8283 1726776632.92609: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:22 Thursday 19 September 2024 16:10:32 -0400 (0:00:00.018) 0:00:16.631 **** 8283 1726776632.92701: entering _queue_task() for managed_node3/stat 8283 1726776632.92887: worker is 1 (out of 1 available) 8283 1726776632.92899: exiting _queue_task() for managed_node3/stat 8283 1726776632.92909: done queuing things up, now waiting for results queue to drain 8283 1726776632.92911: waiting for pending results... 
9033 1726776632.93029: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 9033 1726776632.93150: in run() - task 120fa90a-8a95-c4e4-06a7-000000000276 9033 1726776632.93166: variable 'ansible_search_path' from source: unknown 9033 1726776632.93170: variable 'ansible_search_path' from source: unknown 9033 1726776632.93199: calling self._execute() 9033 1726776632.93254: variable 'ansible_host' from source: host vars for 'managed_node3' 9033 1726776632.93263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9033 1726776632.93272: variable 'omit' from source: magic vars 9033 1726776632.93606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9033 1726776632.93784: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9033 1726776632.93818: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9033 1726776632.93845: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9033 1726776632.93907: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9033 1726776632.93973: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9033 1726776632.93996: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9033 1726776632.94016: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9033 1726776632.94039: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 9033 1726776632.94126: variable '__kernel_settings_is_transactional' from source: set_fact 9033 1726776632.94140: Evaluated conditional (not __kernel_settings_is_transactional is defined): False 9033 1726776632.94144: when evaluation is False, skipping this task 9033 1726776632.94148: _execute() done 9033 1726776632.94152: dumping result to json 9033 1726776632.94156: done dumping result, returning 9033 1726776632.94163: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin [120fa90a-8a95-c4e4-06a7-000000000276] 9033 1726776632.94169: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000276 9033 1726776632.94195: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000276 9033 1726776632.94198: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __kernel_settings_is_transactional is defined", "skip_reason": "Conditional result was False" } 8283 1726776632.94308: no more pending results, returning what we have 8283 1726776632.94311: results queue empty 8283 1726776632.94312: checking for any_errors_fatal 8283 1726776632.94316: done checking for any_errors_fatal 8283 1726776632.94317: checking for max_fail_percentage 8283 1726776632.94318: done checking for max_fail_percentage 8283 1726776632.94319: checking to see 
if all hosts have failed and the running result is not ok 8283 1726776632.94320: done checking to see if all hosts have failed 8283 1726776632.94320: getting the remaining hosts for this loop 8283 1726776632.94321: done getting the remaining hosts for this loop 8283 1726776632.94324: getting the next task for host managed_node3 8283 1726776632.94332: done getting next task for host managed_node3 8283 1726776632.94336: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 8283 1726776632.94340: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776632.94353: getting variables 8283 1726776632.94354: in VariableManager get_vars() 8283 1726776632.94386: Calling all_inventory to load vars for managed_node3 8283 1726776632.94389: Calling groups_inventory to load vars for managed_node3 8283 1726776632.94390: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776632.94398: Calling all_plugins_play to load vars for managed_node3 8283 1726776632.94400: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776632.94402: Calling groups_plugins_play to load vars for managed_node3 8283 1726776632.94444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776632.94479: done with get_vars() 8283 1726776632.94487: done getting variables 8283 1726776632.94524: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:27 Thursday 19 September 2024 16:10:32 -0400 (0:00:00.018) 0:00:16.650 **** 8283 1726776632.94551: entering _queue_task() for managed_node3/set_fact 8283 1726776632.94711: worker is 1 (out of 1 available) 8283 1726776632.94725: exiting _queue_task() for managed_node3/set_fact 8283 1726776632.94739: done queuing things up, now waiting for results queue to drain 8283 1726776632.94741: waiting for pending results... 
9034 1726776632.94858: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 9034 1726776632.94973: in run() - task 120fa90a-8a95-c4e4-06a7-000000000277 9034 1726776632.94991: variable 'ansible_search_path' from source: unknown 9034 1726776632.94995: variable 'ansible_search_path' from source: unknown 9034 1726776632.95021: calling self._execute() 9034 1726776632.95071: variable 'ansible_host' from source: host vars for 'managed_node3' 9034 1726776632.95080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9034 1726776632.95091: variable 'omit' from source: magic vars 9034 1726776632.95410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9034 1726776632.95618: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9034 1726776632.95653: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9034 1726776632.95676: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9034 1726776632.95704: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9034 1726776632.95766: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9034 1726776632.95788: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9034 1726776632.95806: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9034 1726776632.95824: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 9034 1726776632.95910: variable '__kernel_settings_is_transactional' from source: set_fact 9034 1726776632.95921: Evaluated conditional (not __kernel_settings_is_transactional is defined): False 9034 1726776632.95925: when evaluation is False, skipping this task 9034 1726776632.95930: _execute() done 9034 1726776632.95935: dumping result to json 9034 1726776632.95937: done dumping result, returning 9034 1726776632.95941: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists [120fa90a-8a95-c4e4-06a7-000000000277] 9034 1726776632.95946: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000277 9034 1726776632.95965: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000277 9034 1726776632.95967: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __kernel_settings_is_transactional is defined", "skip_reason": "Conditional result was False" } 8283 1726776632.96177: no more pending results, returning what we have 8283 1726776632.96180: results queue empty 8283 1726776632.96180: checking for any_errors_fatal 8283 1726776632.96184: done checking for any_errors_fatal 8283 1726776632.96184: checking for max_fail_percentage 8283 1726776632.96185: done checking for max_fail_percentage 8283 1726776632.96186: checking to see if all 
hosts have failed and the running result is not ok 8283 1726776632.96186: done checking to see if all hosts have failed 8283 1726776632.96186: getting the remaining hosts for this loop 8283 1726776632.96187: done getting the remaining hosts for this loop 8283 1726776632.96190: getting the next task for host managed_node3 8283 1726776632.96196: done getting next task for host managed_node3 8283 1726776632.96198: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 8283 1726776632.96201: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776632.96210: getting variables 8283 1726776632.96211: in VariableManager get_vars() 8283 1726776632.96235: Calling all_inventory to load vars for managed_node3 8283 1726776632.96237: Calling groups_inventory to load vars for managed_node3 8283 1726776632.96238: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776632.96244: Calling all_plugins_play to load vars for managed_node3 8283 1726776632.96245: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776632.96247: Calling groups_plugins_play to load vars for managed_node3 8283 1726776632.96281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776632.96315: done with get_vars() 8283 1726776632.96320: done getting variables 8283 1726776632.96359: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set platform/version specific variables] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:31 Thursday 19 September 2024 16:10:32 -0400 (0:00:00.018) 0:00:16.668 **** 8283 1726776632.96385: entering _queue_task() for managed_node3/include_vars 8283 1726776632.96538: worker is 1 (out of 1 available) 8283 1726776632.96552: exiting _queue_task() for managed_node3/include_vars 8283 1726776632.96564: done queuing things up, now waiting for results queue to drain 8283 1726776632.96565: waiting for pending results... 
9035 1726776632.96674: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 9035 1726776632.96789: in run() - task 120fa90a-8a95-c4e4-06a7-000000000279 9035 1726776632.96805: variable 'ansible_search_path' from source: unknown 9035 1726776632.96809: variable 'ansible_search_path' from source: unknown 9035 1726776632.96835: calling self._execute() 9035 1726776632.96883: variable 'ansible_host' from source: host vars for 'managed_node3' 9035 1726776632.96892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9035 1726776632.96900: variable 'omit' from source: magic vars 9035 1726776632.96971: variable 'omit' from source: magic vars 9035 1726776632.97019: variable 'omit' from source: magic vars 9035 1726776632.97314: variable 'ffparams' from source: task vars 9035 1726776632.97409: variable 'ansible_facts' from source: unknown 9035 1726776632.97506: variable 'ansible_facts' from source: unknown 9035 1726776632.97568: variable 'ansible_facts' from source: unknown 9035 1726776632.97631: variable 'ansible_facts' from source: unknown 9035 1726776632.97684: variable 'role_path' from source: magic vars 9035 1726776632.97803: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 9035 1726776632.97951: Loaded config def from plugin (lookup/first_found) 9035 1726776632.97959: Loading LookupModule 'first_found' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/first_found.py 9035 1726776632.97989: variable 'ansible_search_path' from source: unknown 9035 1726776632.98008: variable 'ansible_search_path' from source: unknown 9035 1726776632.98017: variable 'ansible_search_path' from source: unknown 9035 1726776632.98024: variable 'ansible_search_path' from source: unknown 9035 1726776632.98032: variable 'ansible_search_path' from source: unknown 9035 1726776632.98048: variable 'omit' from source: magic vars 9035 1726776632.98068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9035 1726776632.98088: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9035 1726776632.98104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9035 1726776632.98118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9035 1726776632.98130: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9035 1726776632.98151: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9035 1726776632.98156: variable 'ansible_host' from source: host vars for 'managed_node3' 9035 1726776632.98160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9035 1726776632.98224: Set connection var ansible_module_compression to ZIP_DEFLATED 9035 1726776632.98233: Set connection var ansible_shell_type to sh 9035 1726776632.98240: Set connection var ansible_timeout to 10 9035 1726776632.98245: Set connection var ansible_connection to ssh 9035 1726776632.98252: Set connection var ansible_pipelining to False 9035 1726776632.98258: Set connection var ansible_shell_executable to /bin/sh 9035 1726776632.98273: variable 'ansible_shell_executable' from source: unknown 9035 1726776632.98277: variable 'ansible_connection' from source: unknown 9035 1726776632.98283: variable 
'ansible_module_compression' from source: unknown 9035 1726776632.98286: variable 'ansible_shell_type' from source: unknown 9035 1726776632.98289: variable 'ansible_shell_executable' from source: unknown 9035 1726776632.98292: variable 'ansible_host' from source: host vars for 'managed_node3' 9035 1726776632.98296: variable 'ansible_pipelining' from source: unknown 9035 1726776632.98299: variable 'ansible_timeout' from source: unknown 9035 1726776632.98304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9035 1726776632.98368: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9035 1726776632.98376: variable 'omit' from source: magic vars 9035 1726776632.98379: starting attempt loop 9035 1726776632.98384: running the handler 9035 1726776632.98421: handler run complete 9035 1726776632.98431: attempt loop complete, returning result 9035 1726776632.98433: _execute() done 9035 1726776632.98436: dumping result to json 9035 1726776632.98438: done dumping result, returning 9035 1726776632.98443: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables [120fa90a-8a95-c4e4-06a7-000000000279] 9035 1726776632.98448: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000279 9035 1726776632.98468: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000279 9035 1726776632.98470: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_packages": [ "tuned", "python3-configobj" ], "__kernel_settings_services": [ "tuned" ] }, "ansible_included_var_files": [ "/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/default.yml" ], "changed": false } 8283 1726776632.98685: no more pending results, returning what we have 8283 1726776632.98687: results queue empty 8283 1726776632.98688: checking for any_errors_fatal 8283 1726776632.98690: done checking for any_errors_fatal 8283 1726776632.98691: checking for max_fail_percentage 8283 1726776632.98692: done checking for max_fail_percentage 8283 1726776632.98692: checking to see if all hosts have failed and the running result is not ok 8283 1726776632.98692: done checking to see if all hosts have failed 8283 1726776632.98693: getting the remaining hosts for this loop 8283 1726776632.98693: done getting the remaining hosts for this loop 8283 1726776632.98696: getting the next task for host managed_node3 8283 1726776632.98701: done getting next task for host managed_node3 8283 1726776632.98703: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8283 1726776632.98706: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776632.98712: getting variables 8283 1726776632.98713: in VariableManager get_vars() 8283 1726776632.98737: Calling all_inventory to load vars for managed_node3 8283 1726776632.98739: Calling groups_inventory to load vars for managed_node3 8283 1726776632.98740: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776632.98750: Calling all_plugins_play to load vars for managed_node3 8283 1726776632.98752: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776632.98753: Calling groups_plugins_play to load vars for managed_node3 8283 1726776632.98786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776632.98815: done with get_vars() 8283 1726776632.98821: done getting variables 8283 1726776632.98859: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required packages are installed] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 Thursday 19 September 2024 16:10:32 -0400 (0:00:00.024) 0:00:16.693 **** 8283 1726776632.98881: entering _queue_task() for managed_node3/package 8283 1726776632.99028: worker is 1 (out of 1 available) 8283 1726776632.99043: exiting _queue_task() for managed_node3/package 8283 1726776632.99054: done queuing things up, now waiting for results queue to drain 8283 1726776632.99055: waiting for pending results... 
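The "Set platform/version specific variables" task traced above resolves its vars file with the first_found lookup: a task-vars structure (ffparams) built from ansible_facts and role_path yields a list of candidate file names, and the most specific existing file is loaded with include_vars. In this run only vars/default.yml matched, which sets __kernel_settings_packages to [tuned, python3-configobj] and __kernel_settings_services to [tuned]. A hedged sketch of that pattern, with illustrative candidate names rather than the role's actual list:

- name: Set platform/version specific variables
  include_vars: "{{ lookup('first_found', ffparams) }}"
  vars:
    ffparams:
      files:
        # illustrative candidates, most specific first; the real list is derived
        # from ansible_facts (distribution, version, os_family)
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}.yml"
        - "{{ ansible_facts['os_family'] }}.yml"
        - default.yml
      paths:
        - "{{ role_path }}/vars"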
9036 1726776632.99164: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 9036 1726776632.99264: in run() - task 120fa90a-8a95-c4e4-06a7-0000000001f7 9036 1726776632.99279: variable 'ansible_search_path' from source: unknown 9036 1726776632.99285: variable 'ansible_search_path' from source: unknown 9036 1726776632.99310: calling self._execute() 9036 1726776632.99416: variable 'ansible_host' from source: host vars for 'managed_node3' 9036 1726776632.99424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9036 1726776632.99433: variable 'omit' from source: magic vars 9036 1726776632.99501: variable 'omit' from source: magic vars 9036 1726776632.99539: variable 'omit' from source: magic vars 9036 1726776632.99560: variable '__kernel_settings_packages' from source: include_vars 9036 1726776632.99755: variable '__kernel_settings_packages' from source: include_vars 9036 1726776632.99906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9036 1726776633.01447: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9036 1726776633.01490: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9036 1726776633.01518: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9036 1726776633.01557: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9036 1726776633.01578: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9036 1726776633.01643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9036 1726776633.01665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9036 1726776633.01684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9036 1726776633.01711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9036 1726776633.01722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9036 1726776633.01791: variable '__kernel_settings_is_ostree' from source: set_fact 9036 1726776633.01798: variable 'omit' from source: magic vars 9036 1726776633.01817: variable 'omit' from source: magic vars 9036 1726776633.01840: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9036 1726776633.01861: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9036 1726776633.01876: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9036 1726776633.01890: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9036 1726776633.01900: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9036 1726776633.01922: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9036 1726776633.01926: variable 'ansible_host' from source: host vars for 'managed_node3' 9036 1726776633.01933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9036 1726776633.01995: Set connection var ansible_module_compression to ZIP_DEFLATED 9036 1726776633.02003: Set connection var ansible_shell_type to sh 9036 1726776633.02009: Set connection var ansible_timeout to 10 9036 1726776633.02015: Set connection var ansible_connection to ssh 9036 1726776633.02021: Set connection var ansible_pipelining to False 9036 1726776633.02027: Set connection var ansible_shell_executable to /bin/sh 9036 1726776633.02043: variable 'ansible_shell_executable' from source: unknown 9036 1726776633.02047: variable 'ansible_connection' from source: unknown 9036 1726776633.02050: variable 'ansible_module_compression' from source: unknown 9036 1726776633.02053: variable 'ansible_shell_type' from source: unknown 9036 1726776633.02056: variable 'ansible_shell_executable' from source: unknown 9036 1726776633.02058: variable 'ansible_host' from source: host vars for 'managed_node3' 9036 1726776633.02060: variable 'ansible_pipelining' from source: unknown 9036 1726776633.02062: variable 'ansible_timeout' from source: unknown 9036 1726776633.02064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9036 1726776633.02117: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9036 1726776633.02124: variable 'omit' from source: magic vars 9036 1726776633.02128: starting attempt loop 9036 1726776633.02132: running the handler 9036 1726776633.02185: variable 'ansible_facts' from source: unknown 9036 1726776633.02210: _low_level_execute_command(): starting 9036 1726776633.02216: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9036 1726776633.04505: stdout chunk (state=2): >>>/root <<< 9036 1726776633.04620: stderr chunk (state=3): >>><<< 9036 1726776633.04626: stdout chunk (state=3): >>><<< 9036 1726776633.04647: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9036 1726776633.04663: _low_level_execute_command(): starting 9036 1726776633.04668: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776633.0465884-9036-86385017328313 `" && echo ansible-tmp-1726776633.0465884-9036-86385017328313="` echo /root/.ansible/tmp/ansible-tmp-1726776633.0465884-9036-86385017328313 `" ) && sleep 0' 9036 1726776633.07034: stdout chunk (state=2): >>>ansible-tmp-1726776633.0465884-9036-86385017328313=/root/.ansible/tmp/ansible-tmp-1726776633.0465884-9036-86385017328313 <<< 9036 1726776633.07157: stderr chunk (state=3): >>><<< 9036 1726776633.07165: stdout chunk (state=3): >>><<< 9036 1726776633.07183: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726776633.0465884-9036-86385017328313=/root/.ansible/tmp/ansible-tmp-1726776633.0465884-9036-86385017328313 , stderr= 9036 1726776633.07211: variable 'ansible_module_compression' from source: unknown 9036 1726776633.07259: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 9036 1726776633.07312: variable 'ansible_facts' from source: unknown 9036 1726776633.07470: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776633.0465884-9036-86385017328313/AnsiballZ_setup.py 9036 1726776633.07567: Sending initial data 9036 1726776633.07574: Sent initial data (151 bytes) 9036 1726776633.09954: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpr8t3ofz5 /root/.ansible/tmp/ansible-tmp-1726776633.0465884-9036-86385017328313/AnsiballZ_setup.py <<< 9036 1726776633.11799: stderr chunk (state=3): >>><<< 9036 1726776633.11806: stdout chunk (state=3): >>><<< 9036 1726776633.11823: done transferring module to remote 9036 1726776633.11834: _low_level_execute_command(): starting 9036 1726776633.11839: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776633.0465884-9036-86385017328313/ /root/.ansible/tmp/ansible-tmp-1726776633.0465884-9036-86385017328313/AnsiballZ_setup.py && sleep 0' 9036 1726776633.14161: stderr chunk (state=2): >>><<< 9036 1726776633.14168: stdout chunk (state=2): >>><<< 9036 1726776633.14180: _low_level_execute_command() done: rc=0, stdout=, stderr= 9036 1726776633.14185: _low_level_execute_command(): starting 9036 1726776633.14190: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776633.0465884-9036-86385017328313/AnsiballZ_setup.py && sleep 0' 9036 1726776633.42804: stdout chunk (state=2): >>> {"ansible_facts": {"ansible_pkg_mgr": "dnf"}, "invocation": {"module_args": {"filter": ["ansible_pkg_mgr"], "gather_subset": ["!all"], "gather_timeout": 10, "fact_path": "/etc/ansible/facts.d"}}} <<< 9036 1726776633.44239: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 9036 1726776633.44250: stdout chunk (state=3): >>><<< 9036 1726776633.44260: stderr chunk (state=3): >>><<< 9036 1726776633.44272: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_pkg_mgr": "dnf"}, "invocation": {"module_args": {"filter": ["ansible_pkg_mgr"], "gather_subset": ["!all"], "gather_timeout": 10, "fact_path": "/etc/ansible/facts.d"}}} , stderr=Shared connection to 10.31.8.186 closed. 
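Before the dnf module itself runs, the generic package action gathers a single fact over the same connection to decide which backend to call: the setup module is executed with gather_subset "!all" and filter "ansible_pkg_mgr", returning ansible_pkg_mgr: dnf (the module arguments are visible in the stdout above). Expressed as an explicit task, that internal detection step looks roughly like this sketch (not code from the role):

- name: Detect the package manager only
  ansible.builtin.setup:
    gather_subset: "!all"
    filter: ansible_pkg_mgr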
9036 1726776633.44304: done with _execute_module (ansible.legacy.setup, {'filter': 'ansible_pkg_mgr', 'gather_subset': '!all', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776633.0465884-9036-86385017328313/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9036 1726776633.44324: Facts {'ansible_facts': {'ansible_pkg_mgr': 'dnf'}, 'invocation': {'module_args': {'filter': ['ansible_pkg_mgr'], 'gather_subset': ['!all'], 'gather_timeout': 10, 'fact_path': '/etc/ansible/facts.d'}}, '_ansible_parsed': True} 9036 1726776633.44387: variable 'ansible_module_compression' from source: unknown 9036 1726776633.44436: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 9036 1726776633.44471: variable 'ansible_facts' from source: unknown 9036 1726776633.44615: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776633.0465884-9036-86385017328313/AnsiballZ_dnf.py 9036 1726776633.45633: Sending initial data 9036 1726776633.45643: Sent initial data (149 bytes) 9036 1726776633.49298: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpyx1q8d_z /root/.ansible/tmp/ansible-tmp-1726776633.0465884-9036-86385017328313/AnsiballZ_dnf.py <<< 9036 1726776633.50740: stderr chunk (state=3): >>><<< 9036 1726776633.50753: stdout chunk (state=3): >>><<< 9036 1726776633.50779: done transferring module to remote 9036 1726776633.50790: _low_level_execute_command(): starting 9036 1726776633.50796: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776633.0465884-9036-86385017328313/ /root/.ansible/tmp/ansible-tmp-1726776633.0465884-9036-86385017328313/AnsiballZ_dnf.py && sleep 0' 9036 1726776633.53803: stderr chunk (state=2): >>><<< 9036 1726776633.53812: stdout chunk (state=2): >>><<< 9036 1726776633.53834: _low_level_execute_command() done: rc=0, stdout=, stderr= 9036 1726776633.53840: _low_level_execute_command(): starting 9036 1726776633.53845: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776633.0465884-9036-86385017328313/AnsiballZ_dnf.py && sleep 0' 9036 1726776636.04244: stdout chunk (state=2): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 9036 1726776636.11743: stderr chunk (state=3): 
>>>Shared connection to 10.31.8.186 closed. <<< 9036 1726776636.11793: stderr chunk (state=3): >>><<< 9036 1726776636.11802: stdout chunk (state=3): >>><<< 9036 1726776636.11820: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.8.186 closed. 9036 1726776636.11860: done with _execute_module (ansible.legacy.dnf, {'name': ['tuned', 'python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776633.0465884-9036-86385017328313/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9036 1726776636.11869: _low_level_execute_command(): starting 9036 1726776636.11876: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776633.0465884-9036-86385017328313/ > /dev/null 2>&1 && sleep 0' 9036 1726776636.14479: stderr chunk (state=2): >>><<< 9036 1726776636.14488: stdout chunk (state=2): >>><<< 9036 1726776636.14504: _low_level_execute_command() done: rc=0, stdout=, stderr= 9036 1726776636.14512: handler run complete 9036 1726776636.14554: attempt loop complete, returning result 9036 1726776636.14560: _execute() done 9036 1726776636.14564: dumping result to json 9036 1726776636.14569: done dumping result, returning 9036 1726776636.14577: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed [120fa90a-8a95-c4e4-06a7-0000000001f7] 9036 1726776636.14583: sending task result for task 120fa90a-8a95-c4e4-06a7-0000000001f7 9036 1726776636.14620: done sending task result for task 120fa90a-8a95-c4e4-06a7-0000000001f7 9036 1726776636.14624: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8283 1726776636.15141: no more pending results, returning what we have 8283 1726776636.15145: results queue empty 8283 1726776636.15145: checking for any_errors_fatal 8283 1726776636.15149: done checking for any_errors_fatal 8283 1726776636.15150: checking for max_fail_percentage 8283 1726776636.15151: done checking for max_fail_percentage 8283 1726776636.15151: checking to see if all hosts have failed and the running result is not ok 8283 1726776636.15152: done checking to see if all hosts have failed 8283 1726776636.15153: getting the remaining hosts for 
this loop 8283 1726776636.15154: done getting the remaining hosts for this loop 8283 1726776636.15156: getting the next task for host managed_node3 8283 1726776636.15164: done getting next task for host managed_node3 8283 1726776636.15167: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8283 1726776636.15170: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776636.15180: getting variables 8283 1726776636.15181: in VariableManager get_vars() 8283 1726776636.15206: Calling all_inventory to load vars for managed_node3 8283 1726776636.15209: Calling groups_inventory to load vars for managed_node3 8283 1726776636.15211: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776636.15219: Calling all_plugins_play to load vars for managed_node3 8283 1726776636.15222: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776636.15225: Calling groups_plugins_play to load vars for managed_node3 8283 1726776636.15274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776636.15316: done with get_vars() 8283 1726776636.15323: done getting variables 8283 1726776636.15378: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:24 Thursday 19 September 2024 16:10:36 -0400 (0:00:03.165) 0:00:19.858 **** 8283 1726776636.15407: entering _queue_task() for managed_node3/debug 8283 1726776636.15594: worker is 1 (out of 1 available) 8283 1726776636.15606: exiting _queue_task() for managed_node3/debug 8283 1726776636.15616: done queuing things up, now waiting for results queue to drain 8283 1726776636.15618: waiting for pending results... 
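The "Ensure required packages are installed" task resolved to ansible.legacy.dnf with name=["tuned", "python3-configobj"] and state=present; both packages were already present, so dnf answered "Nothing to do" and the task came back ok (changed=false) after roughly 3.2 seconds. The task at main.yml:12 is roughly equivalent to this sketch (the exact task body is assumed):

- name: Ensure required packages are installed
  package:
    name: "{{ __kernel_settings_packages }}"   # [tuned, python3-configobj] from vars/default.yml
    state: present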
9188 1726776636.15831: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 9188 1726776636.15966: in run() - task 120fa90a-8a95-c4e4-06a7-0000000001f9 9188 1726776636.15983: variable 'ansible_search_path' from source: unknown 9188 1726776636.15987: variable 'ansible_search_path' from source: unknown 9188 1726776636.16017: calling self._execute() 9188 1726776636.16081: variable 'ansible_host' from source: host vars for 'managed_node3' 9188 1726776636.16090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9188 1726776636.16098: variable 'omit' from source: magic vars 9188 1726776636.16532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9188 1726776636.18885: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9188 1726776636.18986: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9188 1726776636.19022: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9188 1726776636.19058: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9188 1726776636.19085: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9188 1726776636.19157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9188 1726776636.19189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9188 1726776636.19221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9188 1726776636.19261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9188 1726776636.19275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9188 1726776636.19371: variable '__kernel_settings_is_transactional' from source: set_fact 9188 1726776636.19390: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 9188 1726776636.19395: when evaluation is False, skipping this task 9188 1726776636.19399: _execute() done 9188 1726776636.19402: dumping result to json 9188 1726776636.19406: done dumping result, returning 9188 1726776636.19411: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes [120fa90a-8a95-c4e4-06a7-0000000001f9] 9188 1726776636.19417: sending task result for task 120fa90a-8a95-c4e4-06a7-0000000001f9 9188 1726776636.19445: done sending task result for task 120fa90a-8a95-c4e4-06a7-0000000001f9 9188 1726776636.19448: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "__kernel_settings_is_transactional | d(false)" } 8283 
1726776636.19788: no more pending results, returning what we have 8283 1726776636.19792: results queue empty 8283 1726776636.19793: checking for any_errors_fatal 8283 1726776636.19801: done checking for any_errors_fatal 8283 1726776636.19802: checking for max_fail_percentage 8283 1726776636.19803: done checking for max_fail_percentage 8283 1726776636.19804: checking to see if all hosts have failed and the running result is not ok 8283 1726776636.19805: done checking to see if all hosts have failed 8283 1726776636.19805: getting the remaining hosts for this loop 8283 1726776636.19806: done getting the remaining hosts for this loop 8283 1726776636.19810: getting the next task for host managed_node3 8283 1726776636.19817: done getting next task for host managed_node3 8283 1726776636.19820: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8283 1726776636.19823: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776636.19841: getting variables 8283 1726776636.19843: in VariableManager get_vars() 8283 1726776636.19874: Calling all_inventory to load vars for managed_node3 8283 1726776636.19877: Calling groups_inventory to load vars for managed_node3 8283 1726776636.19879: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776636.19892: Calling all_plugins_play to load vars for managed_node3 8283 1726776636.19896: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776636.19899: Calling groups_plugins_play to load vars for managed_node3 8283 1726776636.19951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776636.20001: done with get_vars() 8283 1726776636.20010: done getting variables 8283 1726776636.20067: Loading ActionModule 'reboot' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Reboot transactional update systems] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:29 Thursday 19 September 2024 16:10:36 -0400 (0:00:00.046) 0:00:19.905 **** 8283 1726776636.20102: entering _queue_task() for managed_node3/reboot 8283 1726776636.20298: worker is 1 (out of 1 available) 8283 1726776636.20312: exiting _queue_task() for managed_node3/reboot 8283 1726776636.20323: done queuing things up, now waiting for results queue to drain 8283 1726776636.20325: waiting for pending results... 
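The debug notification skipped above and the reboot task queued here share the guard __kernel_settings_is_transactional | d(false); with that flag false on this host, both are skipped, as the result below confirms for the reboot task as well. The pattern is roughly the following sketch (message wording and any additional conditions are assumptions):

- name: Notify user that reboot is needed to apply changes
  debug:
    msg: Reboot required to apply changes.   # assumed wording
  when: __kernel_settings_is_transactional | d(false)

- name: Reboot transactional update systems
  reboot:
  when: __kernel_settings_is_transactional | d(false)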
9191 1726776636.20561: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 9191 1726776636.20694: in run() - task 120fa90a-8a95-c4e4-06a7-0000000001fa 9191 1726776636.20709: variable 'ansible_search_path' from source: unknown 9191 1726776636.20712: variable 'ansible_search_path' from source: unknown 9191 1726776636.20747: calling self._execute() 9191 1726776636.20810: variable 'ansible_host' from source: host vars for 'managed_node3' 9191 1726776636.20819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9191 1726776636.20835: variable 'omit' from source: magic vars 9191 1726776636.21281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9191 1726776636.24320: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9191 1726776636.24396: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9191 1726776636.24435: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9191 1726776636.24467: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9191 1726776636.24494: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9191 1726776636.24565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9191 1726776636.24597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9191 1726776636.24622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9191 1726776636.24662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9191 1726776636.24677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9191 1726776636.24773: variable '__kernel_settings_is_transactional' from source: set_fact 9191 1726776636.24793: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 9191 1726776636.24798: when evaluation is False, skipping this task 9191 1726776636.24801: _execute() done 9191 1726776636.24805: dumping result to json 9191 1726776636.24808: done dumping result, returning 9191 1726776636.24815: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems [120fa90a-8a95-c4e4-06a7-0000000001fa] 9191 1726776636.24822: sending task result for task 120fa90a-8a95-c4e4-06a7-0000000001fa 9191 1726776636.24854: done sending task result for task 120fa90a-8a95-c4e4-06a7-0000000001fa 9191 1726776636.24858: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", "skip_reason": "Conditional 
result was False" } 8283 1726776636.25970: no more pending results, returning what we have 8283 1726776636.25974: results queue empty 8283 1726776636.25975: checking for any_errors_fatal 8283 1726776636.25979: done checking for any_errors_fatal 8283 1726776636.25980: checking for max_fail_percentage 8283 1726776636.25982: done checking for max_fail_percentage 8283 1726776636.25982: checking to see if all hosts have failed and the running result is not ok 8283 1726776636.25983: done checking to see if all hosts have failed 8283 1726776636.25984: getting the remaining hosts for this loop 8283 1726776636.25985: done getting the remaining hosts for this loop 8283 1726776636.25988: getting the next task for host managed_node3 8283 1726776636.25995: done getting next task for host managed_node3 8283 1726776636.25998: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8283 1726776636.26002: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776636.26015: getting variables 8283 1726776636.26016: in VariableManager get_vars() 8283 1726776636.26052: Calling all_inventory to load vars for managed_node3 8283 1726776636.26055: Calling groups_inventory to load vars for managed_node3 8283 1726776636.26057: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776636.26066: Calling all_plugins_play to load vars for managed_node3 8283 1726776636.26068: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776636.26071: Calling groups_plugins_play to load vars for managed_node3 8283 1726776636.26122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776636.26177: done with get_vars() 8283 1726776636.26185: done getting variables 8283 1726776636.26243: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:34 Thursday 19 September 2024 16:10:36 -0400 (0:00:00.061) 0:00:19.967 **** 8283 1726776636.26275: entering _queue_task() for managed_node3/fail 8283 1726776636.26480: worker is 1 (out of 1 available) 8283 1726776636.26493: exiting _queue_task() for managed_node3/fail 8283 1726776636.26505: done queuing things up, now waiting for results queue to drain 8283 1726776636.26506: waiting for pending results... 
9198 1726776636.26846: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 9198 1726776636.26980: in run() - task 120fa90a-8a95-c4e4-06a7-0000000001fb 9198 1726776636.27000: variable 'ansible_search_path' from source: unknown 9198 1726776636.27005: variable 'ansible_search_path' from source: unknown 9198 1726776636.27039: calling self._execute() 9198 1726776636.27103: variable 'ansible_host' from source: host vars for 'managed_node3' 9198 1726776636.27113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9198 1726776636.27122: variable 'omit' from source: magic vars 9198 1726776636.27575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9198 1726776636.30625: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9198 1726776636.30693: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9198 1726776636.30731: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9198 1726776636.30766: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9198 1726776636.30793: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9198 1726776636.30870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9198 1726776636.30903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9198 1726776636.30928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9198 1726776636.30970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9198 1726776636.30988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9198 1726776636.31089: variable '__kernel_settings_is_transactional' from source: set_fact 9198 1726776636.31109: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 9198 1726776636.31113: when evaluation is False, skipping this task 9198 1726776636.31117: _execute() done 9198 1726776636.31120: dumping result to json 9198 1726776636.31123: done dumping result, returning 9198 1726776636.31131: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set [120fa90a-8a95-c4e4-06a7-0000000001fb] 9198 1726776636.31138: sending task result for task 120fa90a-8a95-c4e4-06a7-0000000001fb 9198 1726776636.31166: done sending task result for task 120fa90a-8a95-c4e4-06a7-0000000001fb 9198 1726776636.31169: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", "skip_reason": "Conditional 
result was False" } 8283 1726776636.31670: no more pending results, returning what we have 8283 1726776636.31673: results queue empty 8283 1726776636.31674: checking for any_errors_fatal 8283 1726776636.31679: done checking for any_errors_fatal 8283 1726776636.31679: checking for max_fail_percentage 8283 1726776636.31681: done checking for max_fail_percentage 8283 1726776636.31681: checking to see if all hosts have failed and the running result is not ok 8283 1726776636.31682: done checking to see if all hosts have failed 8283 1726776636.31682: getting the remaining hosts for this loop 8283 1726776636.31684: done getting the remaining hosts for this loop 8283 1726776636.31690: getting the next task for host managed_node3 8283 1726776636.31698: done getting next task for host managed_node3 8283 1726776636.31702: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8283 1726776636.31705: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776636.31718: getting variables 8283 1726776636.31720: in VariableManager get_vars() 8283 1726776636.31753: Calling all_inventory to load vars for managed_node3 8283 1726776636.31756: Calling groups_inventory to load vars for managed_node3 8283 1726776636.31758: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776636.31764: Calling all_plugins_play to load vars for managed_node3 8283 1726776636.31766: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776636.31768: Calling groups_plugins_play to load vars for managed_node3 8283 1726776636.31805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776636.31843: done with get_vars() 8283 1726776636.31851: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Read tuned main config] ****** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42 Thursday 19 September 2024 16:10:36 -0400 (0:00:00.056) 0:00:20.024 **** 8283 1726776636.31937: entering _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config 8283 1726776636.32135: worker is 1 (out of 1 available) 8283 1726776636.32149: exiting _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config 8283 1726776636.32160: done queuing things up, now waiting for results queue to drain 8283 1726776636.32162: waiting for pending results... 
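Annotation: the "Read tuned main config" task queued above (main.yml:42) calls the collection's own module fedora.linux_system_roles.kernel_settings_get_config; the AnsiballZ transfer and the module arguments are visible in the records that follow. A minimal sketch of an equivalent call, using only what the log shows (the module name, the path argument /etc/tuned/tuned-main.conf, and the keys of the returned data dictionary); the register name and the debug task are illustrative:

- name: Read tuned main config
  fedora.linux_system_roles.kernel_settings_get_config:
    path: /etc/tuned/tuned-main.conf
  register: __tuned_main_config   # hypothetical variable name, not from the log

- name: Show whether tuned dynamic tuning is enabled
  ansible.builtin.debug:
    msg: "dynamic_tuning={{ __tuned_main_config.data.dynamic_tuning | d('0') }}"

On this host the module returns ok with data such as daemon=1, dynamic_tuning=0 and update_interval=10, as shown in the task result below.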
9206 1726776636.32378: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 9206 1726776636.32509: in run() - task 120fa90a-8a95-c4e4-06a7-0000000001fd 9206 1726776636.32525: variable 'ansible_search_path' from source: unknown 9206 1726776636.32532: variable 'ansible_search_path' from source: unknown 9206 1726776636.32562: calling self._execute() 9206 1726776636.32624: variable 'ansible_host' from source: host vars for 'managed_node3' 9206 1726776636.32635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9206 1726776636.32643: variable 'omit' from source: magic vars 9206 1726776636.32737: variable 'omit' from source: magic vars 9206 1726776636.32790: variable 'omit' from source: magic vars 9206 1726776636.32816: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 9206 1726776636.33080: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 9206 1726776636.33162: variable '__kernel_settings_tuned_dir' from source: role '' all vars 9206 1726776636.33200: variable 'omit' from source: magic vars 9206 1726776636.33549: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9206 1726776636.33655: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9206 1726776636.33675: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9206 1726776636.33693: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9206 1726776636.33705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9206 1726776636.33733: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9206 1726776636.33739: variable 'ansible_host' from source: host vars for 'managed_node3' 9206 1726776636.33744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9206 1726776636.33841: Set connection var ansible_module_compression to ZIP_DEFLATED 9206 1726776636.33850: Set connection var ansible_shell_type to sh 9206 1726776636.33856: Set connection var ansible_timeout to 10 9206 1726776636.33861: Set connection var ansible_connection to ssh 9206 1726776636.33868: Set connection var ansible_pipelining to False 9206 1726776636.33874: Set connection var ansible_shell_executable to /bin/sh 9206 1726776636.33896: variable 'ansible_shell_executable' from source: unknown 9206 1726776636.33901: variable 'ansible_connection' from source: unknown 9206 1726776636.33905: variable 'ansible_module_compression' from source: unknown 9206 1726776636.33908: variable 'ansible_shell_type' from source: unknown 9206 1726776636.33910: variable 'ansible_shell_executable' from source: unknown 9206 1726776636.33913: variable 'ansible_host' from source: host vars for 'managed_node3' 9206 1726776636.33917: variable 'ansible_pipelining' from source: unknown 9206 1726776636.33920: variable 'ansible_timeout' from source: unknown 9206 1726776636.33924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9206 1726776636.34090: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9206 1726776636.34101: variable 'omit' from source: magic vars 9206 1726776636.34107: starting attempt loop 9206 1726776636.34110: running the handler 9206 1726776636.34121: _low_level_execute_command(): starting 9206 1726776636.34130: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9206 1726776636.38235: stdout chunk (state=2): >>>/root <<< 9206 1726776636.38247: stderr chunk (state=2): >>><<< 9206 1726776636.38261: stdout chunk (state=3): >>><<< 9206 1726776636.38277: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9206 1726776636.38295: _low_level_execute_command(): starting 9206 1726776636.38303: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776636.3828893-9206-142247929382698 `" && echo ansible-tmp-1726776636.3828893-9206-142247929382698="` echo /root/.ansible/tmp/ansible-tmp-1726776636.3828893-9206-142247929382698 `" ) && sleep 0' 9206 1726776636.41834: stdout chunk (state=2): >>>ansible-tmp-1726776636.3828893-9206-142247929382698=/root/.ansible/tmp/ansible-tmp-1726776636.3828893-9206-142247929382698 <<< 9206 1726776636.41843: stderr chunk (state=2): >>><<< 9206 1726776636.41853: stdout chunk (state=3): >>><<< 9206 1726776636.41866: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776636.3828893-9206-142247929382698=/root/.ansible/tmp/ansible-tmp-1726776636.3828893-9206-142247929382698 , stderr= 9206 1726776636.41911: variable 'ansible_module_compression' from source: unknown 9206 1726776636.41948: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 9206 1726776636.41984: variable 'ansible_facts' from source: unknown 9206 1726776636.42089: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776636.3828893-9206-142247929382698/AnsiballZ_kernel_settings_get_config.py 9206 1726776636.42567: Sending initial data 9206 1726776636.42574: Sent initial data (173 bytes) 9206 1726776636.47330: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmprshe9w1j /root/.ansible/tmp/ansible-tmp-1726776636.3828893-9206-142247929382698/AnsiballZ_kernel_settings_get_config.py <<< 9206 1726776636.48722: stderr chunk (state=3): >>><<< 9206 1726776636.48734: stdout chunk (state=3): >>><<< 9206 1726776636.48759: done transferring module to remote 9206 1726776636.48771: _low_level_execute_command(): starting 9206 1726776636.48776: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776636.3828893-9206-142247929382698/ /root/.ansible/tmp/ansible-tmp-1726776636.3828893-9206-142247929382698/AnsiballZ_kernel_settings_get_config.py && sleep 0' 9206 1726776636.52635: stderr chunk (state=2): >>><<< 9206 1726776636.52646: stdout chunk (state=2): >>><<< 9206 1726776636.52662: _low_level_execute_command() done: rc=0, stdout=, stderr= 9206 1726776636.52667: _low_level_execute_command(): starting 9206 1726776636.52673: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776636.3828893-9206-142247929382698/AnsiballZ_kernel_settings_get_config.py && sleep 0' 9206 1726776636.68467: stdout chunk (state=2): >>> {"changed": false, 
"data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} <<< 9206 1726776636.69553: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 9206 1726776636.69566: stdout chunk (state=3): >>><<< 9206 1726776636.69578: stderr chunk (state=3): >>><<< 9206 1726776636.69592: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} , stderr=Shared connection to 10.31.8.186 closed. 9206 1726776636.69623: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/tuned-main.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776636.3828893-9206-142247929382698/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9206 1726776636.69637: _low_level_execute_command(): starting 9206 1726776636.69643: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776636.3828893-9206-142247929382698/ > /dev/null 2>&1 && sleep 0' 9206 1726776636.72807: stderr chunk (state=2): >>><<< 9206 1726776636.72816: stdout chunk (state=2): >>><<< 9206 1726776636.72832: _low_level_execute_command() done: rc=0, stdout=, stderr= 9206 1726776636.72839: handler run complete 9206 1726776636.72859: attempt loop complete, returning result 9206 1726776636.72864: _execute() done 9206 1726776636.72868: dumping result to json 9206 1726776636.72873: done dumping result, returning 9206 1726776636.72880: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config [120fa90a-8a95-c4e4-06a7-0000000001fd] 9206 1726776636.72890: sending task result for task 120fa90a-8a95-c4e4-06a7-0000000001fd 9206 1726776636.72924: done sending task result for task 120fa90a-8a95-c4e4-06a7-0000000001fd 9206 1726776636.72928: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "data": { "daemon": "1", "default_instance_priority": "0", "dynamic_tuning": "0", "log_file_count": "2", "log_file_max_size": "1MB", "reapply_sysctl": "1", "recommend_command": "1", "sleep_interval": "1", "udev_buffer_size": "1MB", "update_interval": "10" } } 8283 1726776636.73335: no more pending results, returning what we have 8283 1726776636.73339: results queue empty 8283 1726776636.73339: checking for any_errors_fatal 8283 1726776636.73344: done checking for any_errors_fatal 8283 1726776636.73345: checking for max_fail_percentage 8283 1726776636.73346: done checking for max_fail_percentage 8283 1726776636.73347: 
checking to see if all hosts have failed and the running result is not ok 8283 1726776636.73347: done checking to see if all hosts have failed 8283 1726776636.73348: getting the remaining hosts for this loop 8283 1726776636.73349: done getting the remaining hosts for this loop 8283 1726776636.73352: getting the next task for host managed_node3 8283 1726776636.73359: done getting next task for host managed_node3 8283 1726776636.73362: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8283 1726776636.73367: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776636.73377: getting variables 8283 1726776636.73379: in VariableManager get_vars() 8283 1726776636.73412: Calling all_inventory to load vars for managed_node3 8283 1726776636.73415: Calling groups_inventory to load vars for managed_node3 8283 1726776636.73417: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776636.73426: Calling all_plugins_play to load vars for managed_node3 8283 1726776636.73430: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776636.73434: Calling groups_plugins_play to load vars for managed_node3 8283 1726776636.73483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776636.73536: done with get_vars() 8283 1726776636.73545: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 Thursday 19 September 2024 16:10:36 -0400 (0:00:00.416) 0:00:20.441 **** 8283 1726776636.73638: entering _queue_task() for managed_node3/stat 8283 1726776636.73832: worker is 1 (out of 1 available) 8283 1726776636.73845: exiting _queue_task() for managed_node3/stat 8283 1726776636.73857: done queuing things up, now waiting for results queue to drain 8283 1726776636.73859: waiting for pending results... 
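Annotation: the "Find tuned profile parent directory" task queued above (main.yml:50) runs the stat module in a loop over candidate directories. In the worker output that follows, the first loop item is an empty string and is skipped via the "item | length > 0" condition, /etc/tuned/profiles is stat'ed and does not exist, and /etc/tuned exists and is a directory. The real loop is built from task vars such as __prof_from_conf that are not expanded in the log, so this sketch writes the candidates out literally and uses a hypothetical register name:

- name: Find tuned profile parent directory
  ansible.builtin.stat:
    path: "{{ item }}"
  loop:
    - ""                    # empty candidate, skipped by the when: condition
    - /etc/tuned/profiles   # does not exist on this host
    - /etc/tuned            # exists and is a directory
  when: item | length > 0
  register: __profile_parent_dirs   # hypothetical variable name, not from the log

The subsequent "Set tuned profile parent dir" set_fact task (main.yml:63, visible further down) evidently consumes these registered results to pick the candidate that exists.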
9240 1726776636.74150: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 9240 1726776636.74284: in run() - task 120fa90a-8a95-c4e4-06a7-0000000001fe 9240 1726776636.74305: variable 'ansible_search_path' from source: unknown 9240 1726776636.74309: variable 'ansible_search_path' from source: unknown 9240 1726776636.74352: variable '__prof_from_conf' from source: task vars 9240 1726776636.74638: variable '__prof_from_conf' from source: task vars 9240 1726776636.74913: variable '__data' from source: task vars 9240 1726776636.74991: variable '__kernel_settings_register_tuned_main' from source: set_fact 9240 1726776636.75213: variable '__kernel_settings_tuned_dir' from source: role '' all vars 9240 1726776636.75224: variable '__kernel_settings_tuned_dir' from source: role '' all vars 9240 1726776636.75283: variable '__kernel_settings_tuned_dir' from source: role '' all vars 9240 1726776636.75304: variable 'omit' from source: magic vars 9240 1726776636.75391: variable 'ansible_host' from source: host vars for 'managed_node3' 9240 1726776636.75402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9240 1726776636.75411: variable 'omit' from source: magic vars 9240 1726776636.75651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9240 1726776636.78060: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9240 1726776636.78136: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9240 1726776636.78171: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9240 1726776636.78214: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9240 1726776636.78247: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9240 1726776636.78317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9240 1726776636.78382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9240 1726776636.78412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9240 1726776636.78454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9240 1726776636.78470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9240 1726776636.78561: variable 'item' from source: unknown 9240 1726776636.78575: Evaluated conditional (item | length > 0): False 9240 1726776636.78580: when evaluation is False, skipping this task 9240 1726776636.78615: variable 'item' from source: unknown 9240 1726776636.78680: variable 'item' from source: unknown skipping: [managed_node3] => (item=) => { 
"ansible_loop_var": "item", "changed": false, "false_condition": "item | length > 0", "item": "", "skip_reason": "Conditional result was False" } 9240 1726776636.78760: variable 'ansible_host' from source: host vars for 'managed_node3' 9240 1726776636.78770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9240 1726776636.78779: variable 'omit' from source: magic vars 9240 1726776636.78924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9240 1726776636.78951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9240 1726776636.78976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9240 1726776636.79017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9240 1726776636.79032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9240 1726776636.79106: variable 'item' from source: unknown 9240 1726776636.79116: Evaluated conditional (item | length > 0): True 9240 1726776636.79122: variable 'omit' from source: magic vars 9240 1726776636.79168: variable 'omit' from source: magic vars 9240 1726776636.79215: variable 'item' from source: unknown 9240 1726776636.79276: variable 'item' from source: unknown 9240 1726776636.79296: variable 'omit' from source: magic vars 9240 1726776636.79325: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9240 1726776636.79354: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9240 1726776636.79371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9240 1726776636.79391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9240 1726776636.79402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9240 1726776636.79630: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9240 1726776636.79636: variable 'ansible_host' from source: host vars for 'managed_node3' 9240 1726776636.79640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9240 1726776636.79732: Set connection var ansible_module_compression to ZIP_DEFLATED 9240 1726776636.79742: Set connection var ansible_shell_type to sh 9240 1726776636.79748: Set connection var ansible_timeout to 10 9240 1726776636.79754: Set connection var ansible_connection to ssh 9240 1726776636.79760: Set connection var ansible_pipelining to False 9240 1726776636.79765: Set connection var ansible_shell_executable to /bin/sh 9240 1726776636.79782: variable 'ansible_shell_executable' from source: unknown 9240 1726776636.79788: variable 'ansible_connection' from source: unknown 9240 1726776636.79792: variable 
'ansible_module_compression' from source: unknown 9240 1726776636.79795: variable 'ansible_shell_type' from source: unknown 9240 1726776636.79798: variable 'ansible_shell_executable' from source: unknown 9240 1726776636.79801: variable 'ansible_host' from source: host vars for 'managed_node3' 9240 1726776636.79804: variable 'ansible_pipelining' from source: unknown 9240 1726776636.79807: variable 'ansible_timeout' from source: unknown 9240 1726776636.79811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9240 1726776636.79940: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9240 1726776636.79950: variable 'omit' from source: magic vars 9240 1726776636.79954: starting attempt loop 9240 1726776636.79957: running the handler 9240 1726776636.79968: _low_level_execute_command(): starting 9240 1726776636.79975: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9240 1726776636.83284: stdout chunk (state=2): >>>/root <<< 9240 1726776636.83424: stderr chunk (state=3): >>><<< 9240 1726776636.83434: stdout chunk (state=3): >>><<< 9240 1726776636.83455: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9240 1726776636.83468: _low_level_execute_command(): starting 9240 1726776636.83474: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776636.834639-9240-12297391346644 `" && echo ansible-tmp-1726776636.834639-9240-12297391346644="` echo /root/.ansible/tmp/ansible-tmp-1726776636.834639-9240-12297391346644 `" ) && sleep 0' 9240 1726776636.86966: stdout chunk (state=2): >>>ansible-tmp-1726776636.834639-9240-12297391346644=/root/.ansible/tmp/ansible-tmp-1726776636.834639-9240-12297391346644 <<< 9240 1726776636.87038: stderr chunk (state=3): >>><<< 9240 1726776636.87047: stdout chunk (state=3): >>><<< 9240 1726776636.87062: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776636.834639-9240-12297391346644=/root/.ansible/tmp/ansible-tmp-1726776636.834639-9240-12297391346644 , stderr= 9240 1726776636.87102: variable 'ansible_module_compression' from source: unknown 9240 1726776636.87156: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 9240 1726776636.87188: variable 'ansible_facts' from source: unknown 9240 1726776636.87282: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776636.834639-9240-12297391346644/AnsiballZ_stat.py 9240 1726776636.88871: Sending initial data 9240 1726776636.88878: Sent initial data (149 bytes) 9240 1726776636.97047: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpst8ncc8d /root/.ansible/tmp/ansible-tmp-1726776636.834639-9240-12297391346644/AnsiballZ_stat.py <<< 9240 1726776637.01538: stderr chunk (state=3): >>><<< 9240 1726776637.01549: stdout chunk (state=3): >>><<< 9240 1726776637.01572: done transferring module to remote 9240 1726776637.01585: _low_level_execute_command(): starting 9240 1726776637.01591: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776636.834639-9240-12297391346644/ /root/.ansible/tmp/ansible-tmp-1726776636.834639-9240-12297391346644/AnsiballZ_stat.py && sleep 0' 9240 1726776637.04696: 
stderr chunk (state=2): >>><<< 9240 1726776637.04705: stdout chunk (state=2): >>><<< 9240 1726776637.04721: _low_level_execute_command() done: rc=0, stdout=, stderr= 9240 1726776637.04726: _low_level_execute_command(): starting 9240 1726776637.04734: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776636.834639-9240-12297391346644/AnsiballZ_stat.py && sleep 0' 9240 1726776637.20390: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 9240 1726776637.21443: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 9240 1726776637.21485: stderr chunk (state=3): >>><<< 9240 1726776637.21495: stdout chunk (state=3): >>><<< 9240 1726776637.21509: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.8.186 closed. 9240 1726776637.21533: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776636.834639-9240-12297391346644/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9240 1726776637.21543: _low_level_execute_command(): starting 9240 1726776637.21548: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776636.834639-9240-12297391346644/ > /dev/null 2>&1 && sleep 0' 9240 1726776637.23918: stderr chunk (state=2): >>><<< 9240 1726776637.23924: stdout chunk (state=2): >>><<< 9240 1726776637.23938: _low_level_execute_command() done: rc=0, stdout=, stderr= 9240 1726776637.23944: handler run complete 9240 1726776637.23959: attempt loop complete, returning result 9240 1726776637.23974: variable 'item' from source: unknown 9240 1726776637.24040: variable 'item' from source: unknown ok: [managed_node3] => (item=/etc/tuned/profiles) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned/profiles", "stat": { "exists": false } } 9240 1726776637.24127: variable 'ansible_host' from source: host vars for 'managed_node3' 9240 1726776637.24139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9240 1726776637.24149: variable 'omit' from source: magic vars 9240 1726776637.24255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9240 1726776637.24278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9240 1726776637.24299: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9240 1726776637.24325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9240 1726776637.24338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9240 1726776637.24398: variable 'item' from source: unknown 9240 1726776637.24407: Evaluated conditional (item | length > 0): True 9240 1726776637.24412: variable 'omit' from source: magic vars 9240 1726776637.24424: variable 'omit' from source: magic vars 9240 1726776637.24455: variable 'item' from source: unknown 9240 1726776637.24502: variable 'item' from source: unknown 9240 1726776637.24516: variable 'omit' from source: magic vars 9240 1726776637.24535: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9240 1726776637.24545: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9240 1726776637.24551: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9240 1726776637.24563: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9240 1726776637.24567: variable 'ansible_host' from source: host vars for 'managed_node3' 9240 1726776637.24571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9240 1726776637.24622: Set connection var ansible_module_compression to ZIP_DEFLATED 9240 1726776637.24630: Set connection var ansible_shell_type to sh 9240 1726776637.24637: Set connection var ansible_timeout to 10 9240 1726776637.24642: Set connection var ansible_connection to ssh 9240 1726776637.24649: Set connection var ansible_pipelining to False 9240 1726776637.24653: Set connection var ansible_shell_executable to /bin/sh 9240 1726776637.24667: variable 'ansible_shell_executable' from source: unknown 9240 1726776637.24670: variable 'ansible_connection' from source: unknown 9240 1726776637.24673: variable 'ansible_module_compression' from source: unknown 9240 1726776637.24676: variable 'ansible_shell_type' from source: unknown 9240 1726776637.24679: variable 'ansible_shell_executable' from source: unknown 9240 1726776637.24683: variable 'ansible_host' from source: host vars for 'managed_node3' 9240 1726776637.24689: variable 'ansible_pipelining' from source: unknown 9240 1726776637.24692: variable 'ansible_timeout' from source: unknown 9240 1726776637.24697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9240 1726776637.24765: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9240 1726776637.24776: variable 'omit' from source: magic vars 9240 1726776637.24781: starting attempt loop 9240 1726776637.24785: running the handler 9240 1726776637.24791: _low_level_execute_command(): starting 9240 
1726776637.24796: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9240 1726776637.26903: stdout chunk (state=2): >>>/root <<< 9240 1726776637.27021: stderr chunk (state=3): >>><<< 9240 1726776637.27027: stdout chunk (state=3): >>><<< 9240 1726776637.27040: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9240 1726776637.27050: _low_level_execute_command(): starting 9240 1726776637.27055: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776637.2704608-9240-251704527262771 `" && echo ansible-tmp-1726776637.2704608-9240-251704527262771="` echo /root/.ansible/tmp/ansible-tmp-1726776637.2704608-9240-251704527262771 `" ) && sleep 0' 9240 1726776637.29417: stdout chunk (state=2): >>>ansible-tmp-1726776637.2704608-9240-251704527262771=/root/.ansible/tmp/ansible-tmp-1726776637.2704608-9240-251704527262771 <<< 9240 1726776637.29540: stderr chunk (state=3): >>><<< 9240 1726776637.29546: stdout chunk (state=3): >>><<< 9240 1726776637.29557: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776637.2704608-9240-251704527262771=/root/.ansible/tmp/ansible-tmp-1726776637.2704608-9240-251704527262771 , stderr= 9240 1726776637.29581: variable 'ansible_module_compression' from source: unknown 9240 1726776637.29615: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 9240 1726776637.29634: variable 'ansible_facts' from source: unknown 9240 1726776637.29688: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776637.2704608-9240-251704527262771/AnsiballZ_stat.py 9240 1726776637.29766: Sending initial data 9240 1726776637.29775: Sent initial data (151 bytes) 9240 1726776637.32171: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpi9looo77 /root/.ansible/tmp/ansible-tmp-1726776637.2704608-9240-251704527262771/AnsiballZ_stat.py <<< 9240 1726776637.33143: stderr chunk (state=3): >>><<< 9240 1726776637.33149: stdout chunk (state=3): >>><<< 9240 1726776637.33164: done transferring module to remote 9240 1726776637.33172: _low_level_execute_command(): starting 9240 1726776637.33177: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776637.2704608-9240-251704527262771/ /root/.ansible/tmp/ansible-tmp-1726776637.2704608-9240-251704527262771/AnsiballZ_stat.py && sleep 0' 9240 1726776637.35487: stderr chunk (state=2): >>><<< 9240 1726776637.35497: stdout chunk (state=2): >>><<< 9240 1726776637.35511: _low_level_execute_command() done: rc=0, stdout=, stderr= 9240 1726776637.35516: _low_level_execute_command(): starting 9240 1726776637.35520: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776637.2704608-9240-251704527262771/AnsiballZ_stat.py && sleep 0' 9240 1726776637.51235: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726776631.537892, "mtime": 1726776629.3928645, "ctime": 1726776629.3928645, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, 
"block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 9240 1726776637.52361: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 9240 1726776637.52409: stderr chunk (state=3): >>><<< 9240 1726776637.52416: stdout chunk (state=3): >>><<< 9240 1726776637.52434: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726776631.537892, "mtime": 1726776629.3928645, "ctime": 1726776629.3928645, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.8.186 closed. 9240 1726776637.52469: done with _execute_module (stat, {'path': '/etc/tuned', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776637.2704608-9240-251704527262771/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9240 1726776637.52477: _low_level_execute_command(): starting 9240 1726776637.52483: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776637.2704608-9240-251704527262771/ > /dev/null 2>&1 && sleep 0' 9240 1726776637.54832: stderr chunk (state=2): >>><<< 9240 1726776637.54842: stdout chunk (state=2): >>><<< 9240 1726776637.54855: _low_level_execute_command() done: rc=0, stdout=, stderr= 9240 1726776637.54861: handler run complete 9240 1726776637.54897: attempt loop complete, returning result 9240 1726776637.54913: variable 'item' from source: unknown 9240 1726776637.54974: variable 'item' from source: unknown ok: [managed_node3] => (item=/etc/tuned) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned", "stat": { "atime": 1726776631.537892, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1726776629.3928645, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 917919, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, 
"mimetype": "inode/directory", "mode": "0755", "mtime": 1726776629.3928645, "nlink": 4, "path": "/etc/tuned", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 159, "uid": 0, "version": "1785990601", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 9240 1726776637.55021: dumping result to json 9240 1726776637.55032: done dumping result, returning 9240 1726776637.55041: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory [120fa90a-8a95-c4e4-06a7-0000000001fe] 9240 1726776637.55047: sending task result for task 120fa90a-8a95-c4e4-06a7-0000000001fe 9240 1726776637.55087: done sending task result for task 120fa90a-8a95-c4e4-06a7-0000000001fe 9240 1726776637.55093: WORKER PROCESS EXITING 8283 1726776637.55261: no more pending results, returning what we have 8283 1726776637.55264: results queue empty 8283 1726776637.55265: checking for any_errors_fatal 8283 1726776637.55269: done checking for any_errors_fatal 8283 1726776637.55269: checking for max_fail_percentage 8283 1726776637.55270: done checking for max_fail_percentage 8283 1726776637.55271: checking to see if all hosts have failed and the running result is not ok 8283 1726776637.55272: done checking to see if all hosts have failed 8283 1726776637.55272: getting the remaining hosts for this loop 8283 1726776637.55273: done getting the remaining hosts for this loop 8283 1726776637.55276: getting the next task for host managed_node3 8283 1726776637.55281: done getting next task for host managed_node3 8283 1726776637.55284: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8283 1726776637.55287: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8283 1726776637.55297: getting variables 8283 1726776637.55298: in VariableManager get_vars() 8283 1726776637.55330: Calling all_inventory to load vars for managed_node3 8283 1726776637.55333: Calling groups_inventory to load vars for managed_node3 8283 1726776637.55335: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776637.55342: Calling all_plugins_play to load vars for managed_node3 8283 1726776637.55345: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776637.55348: Calling groups_plugins_play to load vars for managed_node3 8283 1726776637.55383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776637.55413: done with get_vars() 8283 1726776637.55419: done getting variables 8283 1726776637.55460: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:63 Thursday 19 September 2024 16:10:37 -0400 (0:00:00.818) 0:00:21.259 **** 8283 1726776637.55482: entering _queue_task() for managed_node3/set_fact 8283 1726776637.55642: worker is 1 (out of 1 available) 8283 1726776637.55655: exiting _queue_task() for managed_node3/set_fact 8283 1726776637.55665: done queuing things up, now waiting for results queue to drain 8283 1726776637.55666: waiting for pending results... 9289 1726776637.55785: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 9289 1726776637.55894: in run() - task 120fa90a-8a95-c4e4-06a7-0000000001ff 9289 1726776637.55911: variable 'ansible_search_path' from source: unknown 9289 1726776637.55915: variable 'ansible_search_path' from source: unknown 9289 1726776637.55943: calling self._execute() 9289 1726776637.55992: variable 'ansible_host' from source: host vars for 'managed_node3' 9289 1726776637.56002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9289 1726776637.56010: variable 'omit' from source: magic vars 9289 1726776637.56076: variable 'omit' from source: magic vars 9289 1726776637.56114: variable 'omit' from source: magic vars 9289 1726776637.56605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9289 1726776637.58751: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9289 1726776637.58803: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9289 1726776637.58833: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9289 1726776637.58862: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9289 1726776637.58883: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9289 1726776637.58944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 9289 1726776637.58967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9289 1726776637.58983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9289 1726776637.59010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9289 1726776637.59019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9289 1726776637.59054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9289 1726776637.59069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9289 1726776637.59083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9289 1726776637.59108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9289 1726776637.59116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9289 1726776637.59165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9289 1726776637.59181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9289 1726776637.59198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9289 1726776637.59221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9289 1726776637.59232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9289 1726776637.59390: variable '__kernel_settings_find_profile_dirs' from source: set_fact 9289 1726776637.59458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9289 1726776637.59567: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9289 1726776637.59598: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9289 1726776637.59621: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9289 1726776637.59645: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9289 1726776637.59674: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9289 1726776637.59694: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9289 1726776637.59715: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9289 1726776637.59734: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 9289 1726776637.59771: variable 'omit' from source: magic vars 9289 1726776637.59795: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9289 1726776637.59816: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9289 1726776637.59833: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9289 1726776637.59846: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9289 1726776637.59856: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9289 1726776637.59879: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9289 1726776637.59884: variable 'ansible_host' from source: host vars for 'managed_node3' 9289 1726776637.59890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9289 1726776637.59957: Set connection var ansible_module_compression to ZIP_DEFLATED 9289 1726776637.59965: Set connection var ansible_shell_type to sh 9289 1726776637.59971: Set connection var ansible_timeout to 10 9289 1726776637.59974: Set connection var ansible_connection to ssh 9289 1726776637.59979: Set connection var ansible_pipelining to False 9289 1726776637.59982: Set connection var ansible_shell_executable to /bin/sh 9289 1726776637.59999: variable 'ansible_shell_executable' from source: unknown 9289 1726776637.60002: variable 'ansible_connection' from source: unknown 9289 1726776637.60004: variable 'ansible_module_compression' from source: unknown 9289 1726776637.60006: variable 'ansible_shell_type' from source: unknown 9289 1726776637.60008: variable 'ansible_shell_executable' from source: unknown 9289 1726776637.60009: variable 'ansible_host' from source: host vars for 'managed_node3' 9289 1726776637.60013: variable 'ansible_pipelining' from source: unknown 9289 1726776637.60015: variable 'ansible_timeout' from source: unknown 9289 1726776637.60017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9289 1726776637.60092: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9289 1726776637.60101: variable 'omit' from source: magic vars 9289 1726776637.60104: starting attempt loop 9289 1726776637.60107: running the handler 9289 1726776637.60113: handler run complete 9289 1726776637.60119: attempt loop complete, returning result 9289 1726776637.60122: _execute() done 9289 1726776637.60124: dumping result to json 9289 1726776637.60126: done dumping result, returning 9289 1726776637.60132: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir [120fa90a-8a95-c4e4-06a7-0000000001ff] 9289 1726776637.60137: sending task result for task 120fa90a-8a95-c4e4-06a7-0000000001ff 9289 1726776637.60154: done sending task result for task 120fa90a-8a95-c4e4-06a7-0000000001ff 9289 1726776637.60156: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_profile_parent": "/etc/tuned" }, "changed": false } 8283 1726776637.60496: no more pending results, returning what we have 8283 1726776637.60499: results queue empty 8283 1726776637.60500: checking for any_errors_fatal 8283 1726776637.60508: done checking for any_errors_fatal 8283 1726776637.60509: checking for max_fail_percentage 8283 1726776637.60510: done checking for max_fail_percentage 8283 1726776637.60511: checking to see if all hosts have failed and the running result is not ok 8283 1726776637.60511: done checking to see if all hosts have failed 8283 1726776637.60512: getting the remaining hosts for this loop 8283 1726776637.60514: done getting the remaining hosts for this loop 8283 1726776637.60516: getting the next task for host managed_node3 8283 1726776637.60523: done getting next task for host managed_node3 8283 1726776637.60526: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8283 1726776637.60531: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8283 1726776637.60541: getting variables 8283 1726776637.60543: in VariableManager get_vars() 8283 1726776637.60578: Calling all_inventory to load vars for managed_node3 8283 1726776637.60581: Calling groups_inventory to load vars for managed_node3 8283 1726776637.60583: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776637.60592: Calling all_plugins_play to load vars for managed_node3 8283 1726776637.60594: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776637.60597: Calling groups_plugins_play to load vars for managed_node3 8283 1726776637.60648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776637.60695: done with get_vars() 8283 1726776637.60703: done getting variables 8283 1726776637.60755: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67 Thursday 19 September 2024 16:10:37 -0400 (0:00:00.052) 0:00:21.312 **** 8283 1726776637.60786: entering _queue_task() for managed_node3/service 8283 1726776637.60973: worker is 1 (out of 1 available) 8283 1726776637.60986: exiting _queue_task() for managed_node3/service 8283 1726776637.60998: done queuing things up, now waiting for results queue to drain 8283 1726776637.60999: waiting for pending results... 9292 1726776637.61188: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 9292 1726776637.61288: in run() - task 120fa90a-8a95-c4e4-06a7-000000000200 9292 1726776637.61302: variable 'ansible_search_path' from source: unknown 9292 1726776637.61305: variable 'ansible_search_path' from source: unknown 9292 1726776637.61334: variable '__kernel_settings_services' from source: include_vars 9292 1726776637.61548: variable '__kernel_settings_services' from source: include_vars 9292 1726776637.61694: variable 'omit' from source: magic vars 9292 1726776637.61756: variable 'ansible_host' from source: host vars for 'managed_node3' 9292 1726776637.61766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9292 1726776637.61776: variable 'omit' from source: magic vars 9292 1726776637.61827: variable 'omit' from source: magic vars 9292 1726776637.61862: variable 'omit' from source: magic vars 9292 1726776637.61896: variable 'item' from source: unknown 9292 1726776637.61948: variable 'item' from source: unknown 9292 1726776637.61967: variable 'omit' from source: magic vars 9292 1726776637.62000: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9292 1726776637.62026: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9292 1726776637.62045: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9292 1726776637.62058: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9292 1726776637.62068: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9292 1726776637.62090: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9292 1726776637.62095: variable 'ansible_host' from source: host vars for 'managed_node3' 9292 1726776637.62100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9292 1726776637.62165: Set connection var ansible_module_compression to ZIP_DEFLATED 9292 1726776637.62173: Set connection var ansible_shell_type to sh 9292 1726776637.62179: Set connection var ansible_timeout to 10 9292 1726776637.62185: Set connection var ansible_connection to ssh 9292 1726776637.62192: Set connection var ansible_pipelining to False 9292 1726776637.62197: Set connection var ansible_shell_executable to /bin/sh 9292 1726776637.62212: variable 'ansible_shell_executable' from source: unknown 9292 1726776637.62216: variable 'ansible_connection' from source: unknown 9292 1726776637.62219: variable 'ansible_module_compression' from source: unknown 9292 1726776637.62223: variable 'ansible_shell_type' from source: unknown 9292 1726776637.62226: variable 'ansible_shell_executable' from source: unknown 9292 1726776637.62228: variable 'ansible_host' from source: host vars for 'managed_node3' 9292 1726776637.62233: variable 'ansible_pipelining' from source: unknown 9292 1726776637.62236: variable 'ansible_timeout' from source: unknown 9292 1726776637.62241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9292 1726776637.62323: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9292 1726776637.62335: variable 'omit' from source: magic vars 9292 1726776637.62341: starting attempt loop 9292 1726776637.62344: running the handler 9292 1726776637.62400: variable 'ansible_facts' from source: unknown 9292 1726776637.62430: _low_level_execute_command(): starting 9292 1726776637.62438: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9292 1726776637.64714: stdout chunk (state=2): >>>/root <<< 9292 1726776637.64840: stderr chunk (state=3): >>><<< 9292 1726776637.64847: stdout chunk (state=3): >>><<< 9292 1726776637.64864: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9292 1726776637.64876: _low_level_execute_command(): starting 9292 1726776637.64881: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776637.6487093-9292-94765016013145 `" && echo ansible-tmp-1726776637.6487093-9292-94765016013145="` echo /root/.ansible/tmp/ansible-tmp-1726776637.6487093-9292-94765016013145 `" ) && sleep 0' 9292 1726776637.67286: stdout chunk (state=2): >>>ansible-tmp-1726776637.6487093-9292-94765016013145=/root/.ansible/tmp/ansible-tmp-1726776637.6487093-9292-94765016013145 <<< 9292 1726776637.67411: stderr chunk (state=3): >>><<< 9292 1726776637.67418: stdout chunk (state=3): >>><<< 9292 1726776637.67432: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776637.6487093-9292-94765016013145=/root/.ansible/tmp/ansible-tmp-1726776637.6487093-9292-94765016013145 , stderr= 9292 1726776637.67453: variable 'ansible_module_compression' from source: unknown 9292 1726776637.67491: ANSIBALLZ: using 
cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 9292 1726776637.67536: variable 'ansible_facts' from source: unknown 9292 1726776637.67685: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776637.6487093-9292-94765016013145/AnsiballZ_setup.py 9292 1726776637.67784: Sending initial data 9292 1726776637.67791: Sent initial data (151 bytes) 9292 1726776637.70242: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpskni9lln /root/.ansible/tmp/ansible-tmp-1726776637.6487093-9292-94765016013145/AnsiballZ_setup.py <<< 9292 1726776637.72021: stderr chunk (state=3): >>><<< 9292 1726776637.72028: stdout chunk (state=3): >>><<< 9292 1726776637.72049: done transferring module to remote 9292 1726776637.72061: _low_level_execute_command(): starting 9292 1726776637.72066: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776637.6487093-9292-94765016013145/ /root/.ansible/tmp/ansible-tmp-1726776637.6487093-9292-94765016013145/AnsiballZ_setup.py && sleep 0' 9292 1726776637.74422: stderr chunk (state=2): >>><<< 9292 1726776637.74432: stdout chunk (state=2): >>><<< 9292 1726776637.74446: _low_level_execute_command() done: rc=0, stdout=, stderr= 9292 1726776637.74450: _low_level_execute_command(): starting 9292 1726776637.74455: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776637.6487093-9292-94765016013145/AnsiballZ_setup.py && sleep 0' 9292 1726776638.02135: stdout chunk (state=2): >>> {"ansible_facts": {"ansible_service_mgr": "systemd"}, "invocation": {"module_args": {"gather_subset": ["!all"], "filter": ["ansible_service_mgr"], "gather_timeout": 10, "fact_path": "/etc/ansible/facts.d"}}} <<< 9292 1726776638.04107: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 9292 1726776638.04120: stdout chunk (state=3): >>><<< 9292 1726776638.04133: stderr chunk (state=3): >>><<< 9292 1726776638.04147: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_service_mgr": "systemd"}, "invocation": {"module_args": {"gather_subset": ["!all"], "filter": ["ansible_service_mgr"], "gather_timeout": 10, "fact_path": "/etc/ansible/facts.d"}}} , stderr=Shared connection to 10.31.8.186 closed. 
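The two preparatory steps logged above are the set_fact that pins the tuned profile parent directory (returned earlier as "__kernel_settings_profile_parent": "/etc/tuned") and the minimal fact gathering the service action performs before touching the unit (module_args: gather_subset ["!all"], filter ["ansible_service_mgr"]). A rough YAML sketch of equivalent tasks, reconstructed from the logged arguments rather than copied from the role source, could look like:

- name: Set tuned profile parent dir
  set_fact:
    # the role derives this from __kernel_settings_find_profile_dirs (templated above);
    # shown here as the literal value this run resolved to
    __kernel_settings_profile_parent: /etc/tuned

- name: Gather only the service manager fact (done implicitly by the service action)
  ansible.builtin.setup:
    gather_subset: "!all"
    filter: ansible_service_mgr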
9292 1726776638.04181: done with _execute_module (ansible.legacy.setup, {'gather_subset': '!all', 'filter': 'ansible_service_mgr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776637.6487093-9292-94765016013145/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9292 1726776638.04205: Facts {'ansible_facts': {'ansible_service_mgr': 'systemd'}, 'invocation': {'module_args': {'gather_subset': ['!all'], 'filter': ['ansible_service_mgr'], 'gather_timeout': 10, 'fact_path': '/etc/ansible/facts.d'}}, '_ansible_parsed': True} 9292 1726776638.04272: variable 'ansible_module_compression' from source: unknown 9292 1726776638.04323: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 9292 1726776638.04377: variable 'ansible_facts' from source: unknown 9292 1726776638.04625: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776637.6487093-9292-94765016013145/AnsiballZ_systemd.py 9292 1726776638.05114: Sending initial data 9292 1726776638.05120: Sent initial data (153 bytes) 9292 1726776638.08035: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpju_qmlbi /root/.ansible/tmp/ansible-tmp-1726776637.6487093-9292-94765016013145/AnsiballZ_systemd.py <<< 9292 1726776638.10456: stderr chunk (state=3): >>><<< 9292 1726776638.10463: stdout chunk (state=3): >>><<< 9292 1726776638.10485: done transferring module to remote 9292 1726776638.10496: _low_level_execute_command(): starting 9292 1726776638.10500: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776637.6487093-9292-94765016013145/ /root/.ansible/tmp/ansible-tmp-1726776637.6487093-9292-94765016013145/AnsiballZ_systemd.py && sleep 0' 9292 1726776638.12933: stderr chunk (state=2): >>><<< 9292 1726776638.12940: stdout chunk (state=2): >>><<< 9292 1726776638.12956: _low_level_execute_command() done: rc=0, stdout=, stderr= 9292 1726776638.12960: _low_level_execute_command(): starting 9292 1726776638.12965: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776637.6487093-9292-94765016013145/AnsiballZ_systemd.py && sleep 0' 9292 1726776638.40709: stdout chunk (state=2): >>> {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:10:31 EDT", "WatchdogTimestampMonotonic": "239516555", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "9800", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:31 EDT", 
"ExecMainStartTimestampMonotonic": "239379769", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "9800", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:31 EDT] ; stop_time=[n/a] ; pid=9800 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "15020032", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm 
cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "sysinit.target system.slice dbus.socket dbus.service", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target cpupower.service", "Before": "multi-user.target shutdown.target", "After": "system.slice systemd-journald.socket dbus.service network.target sysinit.target basic.target dbus.socket systemd-sysctl.service polkit.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:10:31 EDT", "Stat<<< 9292 1726776638.40727: stdout chunk (state=3): >>>eChangeTimestampMonotonic": "239516558", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:31 EDT", "InactiveExitTimestampMonotonic": "239379946", "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:31 EDT", "ActiveEnterTimestampMonotonic": "239516558", "ActiveExitTimestamp": "Thu 2024-09-19 16:10:31 EDT", "ActiveExitTimestampMonotonic": "239266126", "InactiveEnterTimestamp": "Thu 2024-09-19 16:10:31 EDT", "InactiveEnterTimestampMonotonic": "239376765", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:31 EDT", "ConditionTimestampMonotonic": "239377966", "AssertTimestamp": "Thu 2024-09-19 16:10:31 EDT", "AssertTimestampMonotonic": "239377967", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "d22f365eaba74b3a87958cacc5d42cbe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 9292 1726776638.42338: stderr chunk (state=3): >>>Shared connection to 
10.31.8.186 closed. <<< 9292 1726776638.42383: stderr chunk (state=3): >>><<< 9292 1726776638.42393: stdout chunk (state=3): >>><<< 9292 1726776638.42412: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:10:31 EDT", "WatchdogTimestampMonotonic": "239516555", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "9800", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:31 EDT", "ExecMainStartTimestampMonotonic": "239379769", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "9800", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:31 EDT] ; stop_time=[n/a] ; pid=9800 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "15020032", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": 
"50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "sysinit.target system.slice dbus.socket dbus.service", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target cpupower.service", "Before": "multi-user.target shutdown.target", "After": "system.slice systemd-journald.socket dbus.service network.target sysinit.target basic.target dbus.socket systemd-sysctl.service polkit.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:10:31 EDT", "StateChangeTimestampMonotonic": "239516558", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:31 EDT", "InactiveExitTimestampMonotonic": "239379946", "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:31 EDT", "ActiveEnterTimestampMonotonic": "239516558", "ActiveExitTimestamp": "Thu 2024-09-19 16:10:31 EDT", "ActiveExitTimestampMonotonic": "239266126", "InactiveEnterTimestamp": "Thu 2024-09-19 16:10:31 EDT", "InactiveEnterTimestampMonotonic": "239376765", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", 
"JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:31 EDT", "ConditionTimestampMonotonic": "239377966", "AssertTimestamp": "Thu 2024-09-19 16:10:31 EDT", "AssertTimestampMonotonic": "239377967", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "d22f365eaba74b3a87958cacc5d42cbe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.8.186 closed. 9292 1726776638.42646: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776637.6487093-9292-94765016013145/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9292 1726776638.42663: _low_level_execute_command(): starting 9292 1726776638.42669: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776637.6487093-9292-94765016013145/ > /dev/null 2>&1 && sleep 0' 9292 1726776638.45013: stderr chunk (state=2): >>><<< 9292 1726776638.45021: stdout chunk (state=2): >>><<< 9292 1726776638.45036: _low_level_execute_command() done: rc=0, stdout=, stderr= 9292 1726776638.45043: handler run complete 9292 1726776638.45077: attempt loop complete, returning result 9292 1726776638.45096: variable 'item' from source: unknown 9292 1726776638.45158: variable 'item' from source: unknown ok: [managed_node3] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:31 EDT", "ActiveEnterTimestampMonotonic": "239516558", "ActiveExitTimestamp": "Thu 2024-09-19 16:10:31 EDT", "ActiveExitTimestampMonotonic": "239266126", "ActiveState": "active", "After": "system.slice systemd-journald.socket dbus.service network.target sysinit.target basic.target dbus.socket systemd-sysctl.service polkit.service", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 16:10:31 EDT", "AssertTimestampMonotonic": "239377967", "Before": "multi-user.target shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": 
"no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:31 EDT", "ConditionTimestampMonotonic": "239377966", "ConfigurationDirectoryMode": "0755", "Conflicts": "auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target cpupower.service", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "9800", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:31 EDT", "ExecMainStartTimestampMonotonic": "239379769", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:31 EDT] ; stop_time=[n/a] ; pid=9800 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestamp": "Thu 2024-09-19 16:10:31 EDT", "InactiveEnterTimestampMonotonic": "239376765", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:31 EDT", "InactiveExitTimestampMonotonic": "239379946", "InvocationID": "d22f365eaba74b3a87958cacc5d42cbe", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", 
"LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "9800", "MemoryAccounting": "yes", "MemoryCurrent": "15020032", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "sysinit.target system.slice dbus.socket dbus.service", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 16:10:31 EDT", "StateChangeTimestampMonotonic": "239516558", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 16:10:31 EDT", "WatchdogTimestampMonotonic": "239516555", "WatchdogUSec": "0" } } 9292 1726776638.45256: dumping result to json 9292 1726776638.45276: done dumping result, returning 9292 1726776638.45284: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started [120fa90a-8a95-c4e4-06a7-000000000200] 9292 1726776638.45293: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000200 9292 1726776638.45401: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000200 9292 1726776638.45405: WORKER PROCESS EXITING 8283 1726776638.45860: no more pending results, returning what we have 8283 1726776638.45863: results queue empty 8283 1726776638.45863: checking for any_errors_fatal 8283 1726776638.45865: done checking for 
any_errors_fatal 8283 1726776638.45865: checking for max_fail_percentage 8283 1726776638.45866: done checking for max_fail_percentage 8283 1726776638.45866: checking to see if all hosts have failed and the running result is not ok 8283 1726776638.45867: done checking to see if all hosts have failed 8283 1726776638.45867: getting the remaining hosts for this loop 8283 1726776638.45868: done getting the remaining hosts for this loop 8283 1726776638.45870: getting the next task for host managed_node3 8283 1726776638.45874: done getting next task for host managed_node3 8283 1726776638.45876: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8283 1726776638.45878: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776638.45883: getting variables 8283 1726776638.45884: in VariableManager get_vars() 8283 1726776638.45905: Calling all_inventory to load vars for managed_node3 8283 1726776638.45907: Calling groups_inventory to load vars for managed_node3 8283 1726776638.45908: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776638.45914: Calling all_plugins_play to load vars for managed_node3 8283 1726776638.45916: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776638.45917: Calling groups_plugins_play to load vars for managed_node3 8283 1726776638.45954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776638.45981: done with get_vars() 8283 1726776638.45986: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74 Thursday 19 September 2024 16:10:38 -0400 (0:00:00.852) 0:00:22.165 **** 8283 1726776638.46053: entering _queue_task() for managed_node3/file 8283 1726776638.46212: worker is 1 (out of 1 available) 8283 1726776638.46226: exiting _queue_task() for managed_node3/file 8283 1726776638.46237: done queuing things up, now waiting for results queue to drain 8283 1726776638.46239: waiting for pending results... 
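The service task that just finished reported item=tuned with "enabled": true and "state": "started", and the logged module_args (name: tuned, state: started, enabled: true) together with the __kernel_settings_services variable loaded from include_vars suggest a loop task roughly like the following; this is an inferred sketch, not a verbatim copy of tasks/main.yml:67:

- name: Ensure required services are enabled and started
  service:
    name: "{{ item }}"
    state: started
    enabled: true
  # __kernel_settings_services is loaded via include_vars; in this run it contains "tuned"
  loop: "{{ __kernel_settings_services }}"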
9321 1726776638.46353: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 9321 1726776638.46464: in run() - task 120fa90a-8a95-c4e4-06a7-000000000201 9321 1726776638.46478: variable 'ansible_search_path' from source: unknown 9321 1726776638.46483: variable 'ansible_search_path' from source: unknown 9321 1726776638.46510: calling self._execute() 9321 1726776638.46563: variable 'ansible_host' from source: host vars for 'managed_node3' 9321 1726776638.46572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9321 1726776638.46581: variable 'omit' from source: magic vars 9321 1726776638.46651: variable 'omit' from source: magic vars 9321 1726776638.46689: variable 'omit' from source: magic vars 9321 1726776638.46710: variable '__kernel_settings_profile_dir' from source: role '' all vars 9321 1726776638.46920: variable '__kernel_settings_profile_dir' from source: role '' all vars 9321 1726776638.46992: variable '__kernel_settings_profile_parent' from source: set_fact 9321 1726776638.47000: variable '__kernel_settings_tuned_profile' from source: role '' all vars 9321 1726776638.47030: variable 'omit' from source: magic vars 9321 1726776638.47060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9321 1726776638.47086: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9321 1726776638.47103: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9321 1726776638.47116: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9321 1726776638.47125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9321 1726776638.47150: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9321 1726776638.47156: variable 'ansible_host' from source: host vars for 'managed_node3' 9321 1726776638.47160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9321 1726776638.47228: Set connection var ansible_module_compression to ZIP_DEFLATED 9321 1726776638.47238: Set connection var ansible_shell_type to sh 9321 1726776638.47244: Set connection var ansible_timeout to 10 9321 1726776638.47250: Set connection var ansible_connection to ssh 9321 1726776638.47257: Set connection var ansible_pipelining to False 9321 1726776638.47262: Set connection var ansible_shell_executable to /bin/sh 9321 1726776638.47277: variable 'ansible_shell_executable' from source: unknown 9321 1726776638.47280: variable 'ansible_connection' from source: unknown 9321 1726776638.47284: variable 'ansible_module_compression' from source: unknown 9321 1726776638.47287: variable 'ansible_shell_type' from source: unknown 9321 1726776638.47290: variable 'ansible_shell_executable' from source: unknown 9321 1726776638.47294: variable 'ansible_host' from source: host vars for 'managed_node3' 9321 1726776638.47298: variable 'ansible_pipelining' from source: unknown 9321 1726776638.47301: variable 'ansible_timeout' from source: unknown 9321 1726776638.47303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9321 1726776638.47437: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9321 1726776638.47445: variable 'omit' from source: magic vars 9321 1726776638.47448: starting attempt loop 9321 1726776638.47451: running the handler 9321 1726776638.47459: _low_level_execute_command(): starting 9321 1726776638.47465: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9321 1726776638.49713: stdout chunk (state=2): >>>/root <<< 9321 1726776638.49831: stderr chunk (state=3): >>><<< 9321 1726776638.49837: stdout chunk (state=3): >>><<< 9321 1726776638.49854: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9321 1726776638.49867: _low_level_execute_command(): starting 9321 1726776638.49874: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776638.4986258-9321-122335513044525 `" && echo ansible-tmp-1726776638.4986258-9321-122335513044525="` echo /root/.ansible/tmp/ansible-tmp-1726776638.4986258-9321-122335513044525 `" ) && sleep 0' 9321 1726776638.52272: stdout chunk (state=2): >>>ansible-tmp-1726776638.4986258-9321-122335513044525=/root/.ansible/tmp/ansible-tmp-1726776638.4986258-9321-122335513044525 <<< 9321 1726776638.52394: stderr chunk (state=3): >>><<< 9321 1726776638.52400: stdout chunk (state=3): >>><<< 9321 1726776638.52412: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776638.4986258-9321-122335513044525=/root/.ansible/tmp/ansible-tmp-1726776638.4986258-9321-122335513044525 , stderr= 9321 1726776638.52446: variable 'ansible_module_compression' from source: unknown 9321 1726776638.52484: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 9321 1726776638.52513: variable 'ansible_facts' from source: unknown 9321 1726776638.52583: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776638.4986258-9321-122335513044525/AnsiballZ_file.py 9321 1726776638.52680: Sending initial data 9321 1726776638.52687: Sent initial data (151 bytes) 9321 1726776638.55081: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpt389k5ix /root/.ansible/tmp/ansible-tmp-1726776638.4986258-9321-122335513044525/AnsiballZ_file.py <<< 9321 1726776638.56062: stderr chunk (state=3): >>><<< 9321 1726776638.56068: stdout chunk (state=3): >>><<< 9321 1726776638.56088: done transferring module to remote 9321 1726776638.56100: _low_level_execute_command(): starting 9321 1726776638.56105: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776638.4986258-9321-122335513044525/ /root/.ansible/tmp/ansible-tmp-1726776638.4986258-9321-122335513044525/AnsiballZ_file.py && sleep 0' 9321 1726776638.58369: stderr chunk (state=2): >>><<< 9321 1726776638.58378: stdout chunk (state=2): >>><<< 9321 1726776638.58392: _low_level_execute_command() done: rc=0, stdout=, stderr= 9321 1726776638.58396: _low_level_execute_command(): starting 9321 1726776638.58401: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776638.4986258-9321-122335513044525/AnsiballZ_file.py && sleep 0' 9321 1726776638.74211: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, 
"gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 9321 1726776638.75314: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 9321 1726776638.75366: stderr chunk (state=3): >>><<< 9321 1726776638.75373: stdout chunk (state=3): >>><<< 9321 1726776638.75392: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.8.186 closed. 
9321 1726776638.75425: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776638.4986258-9321-122335513044525/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9321 1726776638.75436: _low_level_execute_command(): starting 9321 1726776638.75442: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776638.4986258-9321-122335513044525/ > /dev/null 2>&1 && sleep 0' 9321 1726776638.77828: stderr chunk (state=2): >>><<< 9321 1726776638.77837: stdout chunk (state=2): >>><<< 9321 1726776638.77850: _low_level_execute_command() done: rc=0, stdout=, stderr= 9321 1726776638.77858: handler run complete 9321 1726776638.77876: attempt loop complete, returning result 9321 1726776638.77880: _execute() done 9321 1726776638.77883: dumping result to json 9321 1726776638.77889: done dumping result, returning 9321 1726776638.77898: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists [120fa90a-8a95-c4e4-06a7-000000000201] 9321 1726776638.77904: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000201 9321 1726776638.77939: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000201 9321 1726776638.77942: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/tuned/kernel_settings", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "state": "directory", "uid": 0 } 8283 1726776638.78086: no more pending results, returning what we have 8283 1726776638.78089: results queue empty 8283 1726776638.78089: checking for any_errors_fatal 8283 1726776638.78105: done checking for any_errors_fatal 8283 1726776638.78106: checking for max_fail_percentage 8283 1726776638.78108: done checking for max_fail_percentage 8283 1726776638.78108: checking to see if all hosts have failed and the running result is not ok 8283 1726776638.78109: done checking to see if all hosts have failed 8283 1726776638.78109: getting the remaining hosts for this loop 8283 1726776638.78110: done getting the remaining hosts for this loop 8283 1726776638.78113: getting the next task for host managed_node3 8283 1726776638.78119: done getting next task for host managed_node3 8283 1726776638.78122: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8283 1726776638.78125: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776638.78136: getting variables 8283 1726776638.78137: in VariableManager get_vars() 8283 1726776638.78168: Calling all_inventory to load vars for managed_node3 8283 1726776638.78171: Calling groups_inventory to load vars for managed_node3 8283 1726776638.78173: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776638.78181: Calling all_plugins_play to load vars for managed_node3 8283 1726776638.78183: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776638.78185: Calling groups_plugins_play to load vars for managed_node3 8283 1726776638.78221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776638.78257: done with get_vars() 8283 1726776638.78263: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get active_profile] ********** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80 Thursday 19 September 2024 16:10:38 -0400 (0:00:00.322) 0:00:22.488 **** 8283 1726776638.78326: entering _queue_task() for managed_node3/slurp 8283 1726776638.78483: worker is 1 (out of 1 available) 8283 1726776638.78496: exiting _queue_task() for managed_node3/slurp 8283 1726776638.78507: done queuing things up, now waiting for results queue to drain 8283 1726776638.78508: waiting for pending results... 
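The next task reads /etc/tuned/active_profile with the slurp module, which returns the file content base64-encoded. A sketch of such a task, assuming the path and register name that appear later in this trace (illustration only, not the role's actual source):

    - name: Get active_profile
      ansible.builtin.slurp:
        src: /etc/tuned/active_profile  # __kernel_settings_tuned_dir/__kernel_settings_tuned_active_profile in this run
      register: __kernel_settings_tuned_current_profile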
9329 1726776638.78634: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 9329 1726776638.78742: in run() - task 120fa90a-8a95-c4e4-06a7-000000000202 9329 1726776638.78757: variable 'ansible_search_path' from source: unknown 9329 1726776638.78761: variable 'ansible_search_path' from source: unknown 9329 1726776638.78791: calling self._execute() 9329 1726776638.78844: variable 'ansible_host' from source: host vars for 'managed_node3' 9329 1726776638.78853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9329 1726776638.78862: variable 'omit' from source: magic vars 9329 1726776638.78932: variable 'omit' from source: magic vars 9329 1726776638.78971: variable 'omit' from source: magic vars 9329 1726776638.78997: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 9329 1726776638.79209: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 9329 1726776638.79271: variable '__kernel_settings_tuned_dir' from source: role '' all vars 9329 1726776638.79302: variable 'omit' from source: magic vars 9329 1726776638.79335: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9329 1726776638.79362: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9329 1726776638.79379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9329 1726776638.79393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9329 1726776638.79403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9329 1726776638.79424: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9329 1726776638.79430: variable 'ansible_host' from source: host vars for 'managed_node3' 9329 1726776638.79434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9329 1726776638.79504: Set connection var ansible_module_compression to ZIP_DEFLATED 9329 1726776638.79512: Set connection var ansible_shell_type to sh 9329 1726776638.79518: Set connection var ansible_timeout to 10 9329 1726776638.79522: Set connection var ansible_connection to ssh 9329 1726776638.79526: Set connection var ansible_pipelining to False 9329 1726776638.79538: Set connection var ansible_shell_executable to /bin/sh 9329 1726776638.79553: variable 'ansible_shell_executable' from source: unknown 9329 1726776638.79557: variable 'ansible_connection' from source: unknown 9329 1726776638.79560: variable 'ansible_module_compression' from source: unknown 9329 1726776638.79563: variable 'ansible_shell_type' from source: unknown 9329 1726776638.79566: variable 'ansible_shell_executable' from source: unknown 9329 1726776638.79570: variable 'ansible_host' from source: host vars for 'managed_node3' 9329 1726776638.79575: variable 'ansible_pipelining' from source: unknown 9329 1726776638.79578: variable 'ansible_timeout' from source: unknown 9329 1726776638.79582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9329 1726776638.79718: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
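The "Set connection var" lines above record the effective connection settings for this task: ssh connection, sh shell type, /bin/sh as shell executable, a 10 second timeout, and pipelining disabled. These can be supplied per host or group as ordinary inventory variables; a hypothetical group_vars sketch mirroring the values logged here:

    # group_vars/all.yml (hypothetical, mirrors the connection vars in this trace)
    ansible_connection: ssh
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_timeout: 10
    ansible_pipelining: false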
9329 1726776638.79730: variable 'omit' from source: magic vars 9329 1726776638.79735: starting attempt loop 9329 1726776638.79739: running the handler 9329 1726776638.79750: _low_level_execute_command(): starting 9329 1726776638.79757: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9329 1726776638.82031: stdout chunk (state=2): >>>/root <<< 9329 1726776638.82147: stderr chunk (state=3): >>><<< 9329 1726776638.82154: stdout chunk (state=3): >>><<< 9329 1726776638.82172: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9329 1726776638.82185: _low_level_execute_command(): starting 9329 1726776638.82193: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776638.8217912-9329-73306164899932 `" && echo ansible-tmp-1726776638.8217912-9329-73306164899932="` echo /root/.ansible/tmp/ansible-tmp-1726776638.8217912-9329-73306164899932 `" ) && sleep 0' 9329 1726776638.84604: stdout chunk (state=2): >>>ansible-tmp-1726776638.8217912-9329-73306164899932=/root/.ansible/tmp/ansible-tmp-1726776638.8217912-9329-73306164899932 <<< 9329 1726776638.84723: stderr chunk (state=3): >>><<< 9329 1726776638.84732: stdout chunk (state=3): >>><<< 9329 1726776638.84749: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776638.8217912-9329-73306164899932=/root/.ansible/tmp/ansible-tmp-1726776638.8217912-9329-73306164899932 , stderr= 9329 1726776638.84787: variable 'ansible_module_compression' from source: unknown 9329 1726776638.84824: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.slurp-ZIP_DEFLATED 9329 1726776638.84855: variable 'ansible_facts' from source: unknown 9329 1726776638.84926: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776638.8217912-9329-73306164899932/AnsiballZ_slurp.py 9329 1726776638.85032: Sending initial data 9329 1726776638.85039: Sent initial data (151 bytes) 9329 1726776638.87474: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpz7wj0yq3 /root/.ansible/tmp/ansible-tmp-1726776638.8217912-9329-73306164899932/AnsiballZ_slurp.py <<< 9329 1726776638.88434: stderr chunk (state=3): >>><<< 9329 1726776638.88441: stdout chunk (state=3): >>><<< 9329 1726776638.88460: done transferring module to remote 9329 1726776638.88470: _low_level_execute_command(): starting 9329 1726776638.88475: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776638.8217912-9329-73306164899932/ /root/.ansible/tmp/ansible-tmp-1726776638.8217912-9329-73306164899932/AnsiballZ_slurp.py && sleep 0' 9329 1726776638.90748: stderr chunk (state=2): >>><<< 9329 1726776638.90755: stdout chunk (state=2): >>><<< 9329 1726776638.90766: _low_level_execute_command() done: rc=0, stdout=, stderr= 9329 1726776638.90769: _low_level_execute_command(): starting 9329 1726776638.90773: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776638.8217912-9329-73306164899932/AnsiballZ_slurp.py && sleep 0' 9329 1726776639.05562: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 9329 1726776639.06585: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. 
<<< 9329 1726776639.06639: stderr chunk (state=3): >>><<< 9329 1726776639.06647: stdout chunk (state=3): >>><<< 9329 1726776639.06664: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.8.186 closed. 9329 1726776639.06688: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776638.8217912-9329-73306164899932/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9329 1726776639.06703: _low_level_execute_command(): starting 9329 1726776639.06710: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776638.8217912-9329-73306164899932/ > /dev/null 2>&1 && sleep 0' 9329 1726776639.09325: stderr chunk (state=2): >>><<< 9329 1726776639.09336: stdout chunk (state=2): >>><<< 9329 1726776639.09356: _low_level_execute_command() done: rc=0, stdout=, stderr= 9329 1726776639.09363: handler run complete 9329 1726776639.09376: attempt loop complete, returning result 9329 1726776639.09380: _execute() done 9329 1726776639.09384: dumping result to json 9329 1726776639.09394: done dumping result, returning 9329 1726776639.09403: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile [120fa90a-8a95-c4e4-06a7-000000000202] 9329 1726776639.09409: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000202 9329 1726776639.09451: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000202 9329 1726776639.09455: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8283 1726776639.09600: no more pending results, returning what we have 8283 1726776639.09603: results queue empty 8283 1726776639.09604: checking for any_errors_fatal 8283 1726776639.09612: done checking for any_errors_fatal 8283 1726776639.09613: checking for max_fail_percentage 8283 1726776639.09614: done checking for max_fail_percentage 8283 1726776639.09615: checking to see if all hosts have failed and the running result is not ok 8283 1726776639.09615: done checking to see if all hosts have failed 8283 1726776639.09616: getting the remaining hosts for this loop 8283 1726776639.09617: done getting the remaining hosts for this loop 8283 1726776639.09620: getting the next task for host managed_node3 8283 1726776639.09626: done getting next task for host managed_node3 8283 1726776639.09631: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8283 1726776639.09634: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776639.09644: getting variables 8283 1726776639.09645: in VariableManager get_vars() 8283 1726776639.09675: Calling all_inventory to load vars for managed_node3 8283 1726776639.09678: Calling groups_inventory to load vars for managed_node3 8283 1726776639.09680: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776639.09688: Calling all_plugins_play to load vars for managed_node3 8283 1726776639.09692: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776639.09695: Calling groups_plugins_play to load vars for managed_node3 8283 1726776639.09742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776639.09789: done with get_vars() 8283 1726776639.09799: done getting variables 8283 1726776639.09849: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set active_profile] ********** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:85 Thursday 19 September 2024 16:10:39 -0400 (0:00:00.315) 0:00:22.803 **** 8283 1726776639.09872: entering _queue_task() for managed_node3/set_fact 8283 1726776639.10062: worker is 1 (out of 1 available) 8283 1726776639.10077: exiting _queue_task() for managed_node3/set_fact 8283 1726776639.10088: done queuing things up, now waiting for results queue to drain 8283 1726776639.10092: waiting for pending results... 
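The slurp result above returns content=dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK, which is the base64 encoding of "virtual-guest kernel_settings" plus a trailing newline. A small debug task showing how such a payload can be decoded on the controller (illustration; the register name is the variable read by the set_fact task that follows):

    - name: Show the decoded active_profile content
      ansible.builtin.debug:
        msg: "{{ __kernel_settings_tuned_current_profile.content | b64decode | trim }}"
      # prints: virtual-guest kernel_settings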
9337 1726776639.10221: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 9337 1726776639.10367: in run() - task 120fa90a-8a95-c4e4-06a7-000000000203 9337 1726776639.10384: variable 'ansible_search_path' from source: unknown 9337 1726776639.10388: variable 'ansible_search_path' from source: unknown 9337 1726776639.10418: calling self._execute() 9337 1726776639.10488: variable 'ansible_host' from source: host vars for 'managed_node3' 9337 1726776639.10497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9337 1726776639.10506: variable 'omit' from source: magic vars 9337 1726776639.10581: variable 'omit' from source: magic vars 9337 1726776639.10618: variable 'omit' from source: magic vars 9337 1726776639.10898: variable '__kernel_settings_tuned_profile' from source: role '' all vars 9337 1726776639.10908: variable '__cur_profile' from source: task vars 9337 1726776639.11013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9337 1726776639.12541: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9337 1726776639.12585: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9337 1726776639.12616: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9337 1726776639.12645: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9337 1726776639.12664: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9337 1726776639.12719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9337 1726776639.12744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9337 1726776639.12762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9337 1726776639.12789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9337 1726776639.12803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9337 1726776639.12877: variable '__kernel_settings_tuned_current_profile' from source: set_fact 9337 1726776639.12916: variable 'omit' from source: magic vars 9337 1726776639.12942: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9337 1726776639.12964: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9337 1726776639.12979: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9337 1726776639.12994: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9337 
1726776639.13004: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9337 1726776639.13026: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9337 1726776639.13033: variable 'ansible_host' from source: host vars for 'managed_node3' 9337 1726776639.13037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9337 1726776639.13100: Set connection var ansible_module_compression to ZIP_DEFLATED 9337 1726776639.13108: Set connection var ansible_shell_type to sh 9337 1726776639.13115: Set connection var ansible_timeout to 10 9337 1726776639.13120: Set connection var ansible_connection to ssh 9337 1726776639.13127: Set connection var ansible_pipelining to False 9337 1726776639.13133: Set connection var ansible_shell_executable to /bin/sh 9337 1726776639.13149: variable 'ansible_shell_executable' from source: unknown 9337 1726776639.13154: variable 'ansible_connection' from source: unknown 9337 1726776639.13158: variable 'ansible_module_compression' from source: unknown 9337 1726776639.13162: variable 'ansible_shell_type' from source: unknown 9337 1726776639.13165: variable 'ansible_shell_executable' from source: unknown 9337 1726776639.13167: variable 'ansible_host' from source: host vars for 'managed_node3' 9337 1726776639.13169: variable 'ansible_pipelining' from source: unknown 9337 1726776639.13171: variable 'ansible_timeout' from source: unknown 9337 1726776639.13173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9337 1726776639.13227: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9337 1726776639.13238: variable 'omit' from source: magic vars 9337 1726776639.13242: starting attempt loop 9337 1726776639.13244: running the handler 9337 1726776639.13251: handler run complete 9337 1726776639.13257: attempt loop complete, returning result 9337 1726776639.13259: _execute() done 9337 1726776639.13262: dumping result to json 9337 1726776639.13264: done dumping result, returning 9337 1726776639.13267: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile [120fa90a-8a95-c4e4-06a7-000000000203] 9337 1726776639.13273: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000203 9337 1726776639.13288: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000203 9337 1726776639.13292: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_active_profile": "virtual-guest kernel_settings" }, "changed": false } 8283 1726776639.13546: no more pending results, returning what we have 8283 1726776639.13548: results queue empty 8283 1726776639.13548: checking for any_errors_fatal 8283 1726776639.13552: done checking for any_errors_fatal 8283 1726776639.13553: checking for max_fail_percentage 8283 1726776639.13554: done checking for max_fail_percentage 8283 1726776639.13554: checking to see if all hosts have failed and the running result is not ok 8283 1726776639.13554: done checking to see if all hosts have failed 8283 1726776639.13555: getting the remaining hosts for this loop 8283 1726776639.13555: done getting the remaining hosts for this loop 8283 1726776639.13557: getting the next task 
for host managed_node3 8283 1726776639.13561: done getting next task for host managed_node3 8283 1726776639.13564: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8283 1726776639.13566: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776639.13575: getting variables 8283 1726776639.13576: in VariableManager get_vars() 8283 1726776639.13603: Calling all_inventory to load vars for managed_node3 8283 1726776639.13605: Calling groups_inventory to load vars for managed_node3 8283 1726776639.13606: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776639.13612: Calling all_plugins_play to load vars for managed_node3 8283 1726776639.13614: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776639.13615: Calling groups_plugins_play to load vars for managed_node3 8283 1726776639.13648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776639.13681: done with get_vars() 8283 1726776639.13685: done getting variables 8283 1726776639.13723: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91 Thursday 19 September 2024 16:10:39 -0400 (0:00:00.038) 0:00:22.842 **** 8283 1726776639.13748: entering _queue_task() for managed_node3/copy 8283 1726776639.13893: worker is 1 (out of 1 available) 8283 1726776639.13906: exiting _queue_task() for managed_node3/copy 8283 1726776639.13917: done queuing things up, now waiting for results queue to drain 8283 1726776639.13919: waiting for pending results... 
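The next task writes the active profile name back to /etc/tuned/active_profile. Judging from the module arguments in the trace that follows (dest=/etc/tuned/active_profile, mode=0600, content delivered through a temporary file), a copy task along these lines would produce it (illustration only, not the role's actual source):

    - name: Ensure kernel_settings is in active_profile
      ansible.builtin.copy:
        content: "{{ __kernel_settings_active_profile }}"  # "virtual-guest kernel_settings" in this run
        dest: /etc/tuned/active_profile
        mode: "0600"

Because the existing file already carries the same sha1 checksum, the copy action only falls through to the file module to enforce mode and ownership, and the task reports changed=false.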
9338 1726776639.14034: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 9338 1726776639.14138: in run() - task 120fa90a-8a95-c4e4-06a7-000000000204 9338 1726776639.14153: variable 'ansible_search_path' from source: unknown 9338 1726776639.14157: variable 'ansible_search_path' from source: unknown 9338 1726776639.14183: calling self._execute() 9338 1726776639.14233: variable 'ansible_host' from source: host vars for 'managed_node3' 9338 1726776639.14242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9338 1726776639.14250: variable 'omit' from source: magic vars 9338 1726776639.14319: variable 'omit' from source: magic vars 9338 1726776639.14359: variable 'omit' from source: magic vars 9338 1726776639.14380: variable '__kernel_settings_active_profile' from source: set_fact 9338 1726776639.14581: variable '__kernel_settings_active_profile' from source: set_fact 9338 1726776639.14604: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 9338 1726776639.14655: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 9338 1726776639.14710: variable '__kernel_settings_tuned_dir' from source: role '' all vars 9338 1726776639.14733: variable 'omit' from source: magic vars 9338 1726776639.14763: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9338 1726776639.14788: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9338 1726776639.14809: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9338 1726776639.14822: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9338 1726776639.14834: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9338 1726776639.14856: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9338 1726776639.14860: variable 'ansible_host' from source: host vars for 'managed_node3' 9338 1726776639.14864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9338 1726776639.14933: Set connection var ansible_module_compression to ZIP_DEFLATED 9338 1726776639.14941: Set connection var ansible_shell_type to sh 9338 1726776639.14948: Set connection var ansible_timeout to 10 9338 1726776639.14954: Set connection var ansible_connection to ssh 9338 1726776639.14960: Set connection var ansible_pipelining to False 9338 1726776639.14966: Set connection var ansible_shell_executable to /bin/sh 9338 1726776639.14979: variable 'ansible_shell_executable' from source: unknown 9338 1726776639.14983: variable 'ansible_connection' from source: unknown 9338 1726776639.14986: variable 'ansible_module_compression' from source: unknown 9338 1726776639.14992: variable 'ansible_shell_type' from source: unknown 9338 1726776639.14996: variable 'ansible_shell_executable' from source: unknown 9338 1726776639.14999: variable 'ansible_host' from source: host vars for 'managed_node3' 9338 1726776639.15004: variable 'ansible_pipelining' from source: unknown 9338 1726776639.15007: variable 'ansible_timeout' from source: unknown 9338 1726776639.15010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9338 1726776639.15097: Loading ActionModule 'copy' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9338 1726776639.15107: variable 'omit' from source: magic vars 9338 1726776639.15113: starting attempt loop 9338 1726776639.15116: running the handler 9338 1726776639.15126: _low_level_execute_command(): starting 9338 1726776639.15134: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9338 1726776639.17414: stdout chunk (state=2): >>>/root <<< 9338 1726776639.17537: stderr chunk (state=3): >>><<< 9338 1726776639.17543: stdout chunk (state=3): >>><<< 9338 1726776639.17558: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9338 1726776639.17568: _low_level_execute_command(): starting 9338 1726776639.17571: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776639.1756332-9338-183747929631092 `" && echo ansible-tmp-1726776639.1756332-9338-183747929631092="` echo /root/.ansible/tmp/ansible-tmp-1726776639.1756332-9338-183747929631092 `" ) && sleep 0' 9338 1726776639.19957: stdout chunk (state=2): >>>ansible-tmp-1726776639.1756332-9338-183747929631092=/root/.ansible/tmp/ansible-tmp-1726776639.1756332-9338-183747929631092 <<< 9338 1726776639.20082: stderr chunk (state=3): >>><<< 9338 1726776639.20088: stdout chunk (state=3): >>><<< 9338 1726776639.20101: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776639.1756332-9338-183747929631092=/root/.ansible/tmp/ansible-tmp-1726776639.1756332-9338-183747929631092 , stderr= 9338 1726776639.20168: variable 'ansible_module_compression' from source: unknown 9338 1726776639.20210: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 9338 1726776639.20238: variable 'ansible_facts' from source: unknown 9338 1726776639.20305: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776639.1756332-9338-183747929631092/AnsiballZ_stat.py 9338 1726776639.20465: Sending initial data 9338 1726776639.20472: Sent initial data (151 bytes) 9338 1726776639.22836: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpu5o3eeag /root/.ansible/tmp/ansible-tmp-1726776639.1756332-9338-183747929631092/AnsiballZ_stat.py <<< 9338 1726776639.23812: stderr chunk (state=3): >>><<< 9338 1726776639.23821: stdout chunk (state=3): >>><<< 9338 1726776639.23843: done transferring module to remote 9338 1726776639.23855: _low_level_execute_command(): starting 9338 1726776639.23861: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776639.1756332-9338-183747929631092/ /root/.ansible/tmp/ansible-tmp-1726776639.1756332-9338-183747929631092/AnsiballZ_stat.py && sleep 0' 9338 1726776639.26220: stderr chunk (state=2): >>><<< 9338 1726776639.26231: stdout chunk (state=2): >>><<< 9338 1726776639.26246: _low_level_execute_command() done: rc=0, stdout=, stderr= 9338 1726776639.26251: _low_level_execute_command(): starting 9338 1726776639.26256: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776639.1756332-9338-183747929631092/AnsiballZ_stat.py && sleep 0' 9338 1726776639.42225: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": 
"/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 499122370, "dev": 51713, "nlink": 1, "atime": 1726776639.0549896, "mtime": 1726776631.5598924, "ctime": 1726776631.5598924, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "3787864203", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 9338 1726776639.43341: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 9338 1726776639.43388: stderr chunk (state=3): >>><<< 9338 1726776639.43396: stdout chunk (state=3): >>><<< 9338 1726776639.43411: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 499122370, "dev": 51713, "nlink": 1, "atime": 1726776639.0549896, "mtime": 1726776631.5598924, "ctime": 1726776631.5598924, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "3787864203", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.186 closed. 
9338 1726776639.43459: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776639.1756332-9338-183747929631092/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9338 1726776639.43495: variable 'ansible_module_compression' from source: unknown 9338 1726776639.43526: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 9338 1726776639.43546: variable 'ansible_facts' from source: unknown 9338 1726776639.43603: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776639.1756332-9338-183747929631092/AnsiballZ_file.py 9338 1726776639.43688: Sending initial data 9338 1726776639.43695: Sent initial data (151 bytes) 9338 1726776639.46174: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmp8hq04rmc /root/.ansible/tmp/ansible-tmp-1726776639.1756332-9338-183747929631092/AnsiballZ_file.py <<< 9338 1726776639.47169: stderr chunk (state=3): >>><<< 9338 1726776639.47176: stdout chunk (state=3): >>><<< 9338 1726776639.47194: done transferring module to remote 9338 1726776639.47203: _low_level_execute_command(): starting 9338 1726776639.47208: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776639.1756332-9338-183747929631092/ /root/.ansible/tmp/ansible-tmp-1726776639.1756332-9338-183747929631092/AnsiballZ_file.py && sleep 0' 9338 1726776639.49504: stderr chunk (state=2): >>><<< 9338 1726776639.49513: stdout chunk (state=2): >>><<< 9338 1726776639.49524: _low_level_execute_command() done: rc=0, stdout=, stderr= 9338 1726776639.49527: _low_level_execute_command(): starting 9338 1726776639.49533: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776639.1756332-9338-183747929631092/AnsiballZ_file.py && sleep 0' 9338 1726776639.65486: stdout chunk (state=2): >>> <<< 9338 1726776639.65501: stdout chunk (state=3): >>>{"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmpdfx1o5l1", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 9338 1726776639.66617: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. 
<<< 9338 1726776639.66666: stderr chunk (state=3): >>><<< 9338 1726776639.66673: stdout chunk (state=3): >>><<< 9338 1726776639.66689: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmpdfx1o5l1", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.8.186 closed. 9338 1726776639.66717: done with _execute_module (ansible.legacy.file, {'mode': '0600', 'dest': '/etc/tuned/active_profile', '_original_basename': 'tmpdfx1o5l1', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776639.1756332-9338-183747929631092/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9338 1726776639.66728: _low_level_execute_command(): starting 9338 1726776639.66735: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776639.1756332-9338-183747929631092/ > /dev/null 2>&1 && sleep 0' 9338 1726776639.69130: stderr chunk (state=2): >>><<< 9338 1726776639.69137: stdout chunk (state=2): >>><<< 9338 1726776639.69152: _low_level_execute_command() done: rc=0, stdout=, stderr= 9338 1726776639.69161: handler run complete 9338 1726776639.69181: attempt loop complete, returning result 9338 1726776639.69184: _execute() done 9338 1726776639.69188: dumping result to json 9338 1726776639.69194: done dumping result, returning 9338 1726776639.69202: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile [120fa90a-8a95-c4e4-06a7-000000000204] 9338 1726776639.69208: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000204 9338 1726776639.69244: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000204 9338 1726776639.69247: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/active_profile", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "state": "file", "uid": 0 } 8283 1726776639.69412: no more pending results, returning what we have 8283 1726776639.69415: results queue empty 8283 1726776639.69416: checking for any_errors_fatal 8283 1726776639.69421: done checking for any_errors_fatal 
8283 1726776639.69421: checking for max_fail_percentage 8283 1726776639.69423: done checking for max_fail_percentage 8283 1726776639.69423: checking to see if all hosts have failed and the running result is not ok 8283 1726776639.69424: done checking to see if all hosts have failed 8283 1726776639.69424: getting the remaining hosts for this loop 8283 1726776639.69425: done getting the remaining hosts for this loop 8283 1726776639.69430: getting the next task for host managed_node3 8283 1726776639.69436: done getting next task for host managed_node3 8283 1726776639.69439: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8283 1726776639.69442: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776639.69451: getting variables 8283 1726776639.69452: in VariableManager get_vars() 8283 1726776639.69482: Calling all_inventory to load vars for managed_node3 8283 1726776639.69484: Calling groups_inventory to load vars for managed_node3 8283 1726776639.69486: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776639.69494: Calling all_plugins_play to load vars for managed_node3 8283 1726776639.69496: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776639.69498: Calling groups_plugins_play to load vars for managed_node3 8283 1726776639.69536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776639.69569: done with get_vars() 8283 1726776639.69576: done getting variables 8283 1726776639.69616: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set profile_mode to manual] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99 Thursday 19 September 2024 16:10:39 -0400 (0:00:00.558) 0:00:23.401 **** 8283 1726776639.69642: entering _queue_task() for managed_node3/copy 8283 1726776639.69802: worker is 1 (out of 1 available) 8283 1726776639.69815: exiting _queue_task() for managed_node3/copy 8283 1726776639.69827: done queuing things up, now waiting for results queue to drain 8283 1726776639.69830: waiting for pending results... 
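The final task visible in this fragment manages /etc/tuned/profile_mode. Its module arguments (dest=/etc/tuned/profile_mode, mode=0600, a 7-byte file, consistent with the string "manual" plus a newline) suggest a copy task along these lines (illustration only; the variable name mirrors __kernel_settings_tuned_profile_mode from the trace):

    - name: Set profile_mode to manual
      ansible.builtin.copy:
        content: "{{ __kernel_settings_tuned_profile_mode }}"  # assumed to resolve to "manual" in this run
        dest: /etc/tuned/profile_mode
        mode: "0600"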
9358 1726776639.69945: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 9358 1726776639.70049: in run() - task 120fa90a-8a95-c4e4-06a7-000000000205 9358 1726776639.70066: variable 'ansible_search_path' from source: unknown 9358 1726776639.70071: variable 'ansible_search_path' from source: unknown 9358 1726776639.70098: calling self._execute() 9358 1726776639.70152: variable 'ansible_host' from source: host vars for 'managed_node3' 9358 1726776639.70160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9358 1726776639.70166: variable 'omit' from source: magic vars 9358 1726776639.70234: variable 'omit' from source: magic vars 9358 1726776639.70270: variable 'omit' from source: magic vars 9358 1726776639.70289: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 9358 1726776639.70497: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 9358 1726776639.70626: variable '__kernel_settings_tuned_dir' from source: role '' all vars 9358 1726776639.70654: variable 'omit' from source: magic vars 9358 1726776639.70686: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9358 1726776639.70714: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9358 1726776639.70734: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9358 1726776639.70748: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9358 1726776639.70760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9358 1726776639.70782: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9358 1726776639.70787: variable 'ansible_host' from source: host vars for 'managed_node3' 9358 1726776639.70791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9358 1726776639.70858: Set connection var ansible_module_compression to ZIP_DEFLATED 9358 1726776639.70866: Set connection var ansible_shell_type to sh 9358 1726776639.70873: Set connection var ansible_timeout to 10 9358 1726776639.70878: Set connection var ansible_connection to ssh 9358 1726776639.70885: Set connection var ansible_pipelining to False 9358 1726776639.70891: Set connection var ansible_shell_executable to /bin/sh 9358 1726776639.70905: variable 'ansible_shell_executable' from source: unknown 9358 1726776639.70909: variable 'ansible_connection' from source: unknown 9358 1726776639.70912: variable 'ansible_module_compression' from source: unknown 9358 1726776639.70915: variable 'ansible_shell_type' from source: unknown 9358 1726776639.70918: variable 'ansible_shell_executable' from source: unknown 9358 1726776639.70922: variable 'ansible_host' from source: host vars for 'managed_node3' 9358 1726776639.70925: variable 'ansible_pipelining' from source: unknown 9358 1726776639.70926: variable 'ansible_timeout' from source: unknown 9358 1726776639.70937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9358 1726776639.71019: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 9358 1726776639.71027: variable 'omit' from source: magic vars 9358 1726776639.71033: starting attempt loop 9358 1726776639.71035: running the handler 9358 1726776639.71043: _low_level_execute_command(): starting 9358 1726776639.71049: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9358 1726776639.73297: stdout chunk (state=2): >>>/root <<< 9358 1726776639.73412: stderr chunk (state=3): >>><<< 9358 1726776639.73418: stdout chunk (state=3): >>><<< 9358 1726776639.73437: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9358 1726776639.73448: _low_level_execute_command(): starting 9358 1726776639.73454: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776639.7344368-9358-211585200112970 `" && echo ansible-tmp-1726776639.7344368-9358-211585200112970="` echo /root/.ansible/tmp/ansible-tmp-1726776639.7344368-9358-211585200112970 `" ) && sleep 0' 9358 1726776639.75842: stdout chunk (state=2): >>>ansible-tmp-1726776639.7344368-9358-211585200112970=/root/.ansible/tmp/ansible-tmp-1726776639.7344368-9358-211585200112970 <<< 9358 1726776639.75964: stderr chunk (state=3): >>><<< 9358 1726776639.75972: stdout chunk (state=3): >>><<< 9358 1726776639.75987: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776639.7344368-9358-211585200112970=/root/.ansible/tmp/ansible-tmp-1726776639.7344368-9358-211585200112970 , stderr= 9358 1726776639.76048: variable 'ansible_module_compression' from source: unknown 9358 1726776639.76090: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 9358 1726776639.76118: variable 'ansible_facts' from source: unknown 9358 1726776639.76187: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776639.7344368-9358-211585200112970/AnsiballZ_stat.py 9358 1726776639.76268: Sending initial data 9358 1726776639.76275: Sent initial data (151 bytes) 9358 1726776639.78675: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpgqunv424 /root/.ansible/tmp/ansible-tmp-1726776639.7344368-9358-211585200112970/AnsiballZ_stat.py <<< 9358 1726776639.79644: stderr chunk (state=3): >>><<< 9358 1726776639.79650: stdout chunk (state=3): >>><<< 9358 1726776639.79665: done transferring module to remote 9358 1726776639.79674: _low_level_execute_command(): starting 9358 1726776639.79679: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776639.7344368-9358-211585200112970/ /root/.ansible/tmp/ansible-tmp-1726776639.7344368-9358-211585200112970/AnsiballZ_stat.py && sleep 0' 9358 1726776639.81909: stderr chunk (state=2): >>><<< 9358 1726776639.81915: stdout chunk (state=2): >>><<< 9358 1726776639.81926: _low_level_execute_command() done: rc=0, stdout=, stderr= 9358 1726776639.81932: _low_level_execute_command(): starting 9358 1726776639.81937: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776639.7344368-9358-211585200112970/AnsiballZ_stat.py && sleep 0' 9358 1726776639.97766: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 505413763, "dev": 51713, 
"nlink": 1, "atime": 1726776631.5368922, "mtime": 1726776631.5608926, "ctime": 1726776631.5608926, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "3997735162", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 9358 1726776639.98922: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 9358 1726776639.98970: stderr chunk (state=3): >>><<< 9358 1726776639.98978: stdout chunk (state=3): >>><<< 9358 1726776639.98996: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 505413763, "dev": 51713, "nlink": 1, "atime": 1726776631.5368922, "mtime": 1726776631.5608926, "ctime": 1726776631.5608926, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "3997735162", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.186 closed. 
9358 1726776639.99048: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776639.7344368-9358-211585200112970/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9358 1726776639.99083: variable 'ansible_module_compression' from source: unknown 9358 1726776639.99118: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 9358 1726776639.99137: variable 'ansible_facts' from source: unknown 9358 1726776639.99197: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776639.7344368-9358-211585200112970/AnsiballZ_file.py 9358 1726776639.99285: Sending initial data 9358 1726776639.99295: Sent initial data (151 bytes) 9358 1726776640.01827: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpfbx5e7jl /root/.ansible/tmp/ansible-tmp-1726776639.7344368-9358-211585200112970/AnsiballZ_file.py <<< 9358 1726776640.02841: stderr chunk (state=3): >>><<< 9358 1726776640.02848: stdout chunk (state=3): >>><<< 9358 1726776640.02865: done transferring module to remote 9358 1726776640.02874: _low_level_execute_command(): starting 9358 1726776640.02879: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776639.7344368-9358-211585200112970/ /root/.ansible/tmp/ansible-tmp-1726776639.7344368-9358-211585200112970/AnsiballZ_file.py && sleep 0' 9358 1726776640.05215: stderr chunk (state=2): >>><<< 9358 1726776640.05222: stdout chunk (state=2): >>><<< 9358 1726776640.05235: _low_level_execute_command() done: rc=0, stdout=, stderr= 9358 1726776640.05239: _low_level_execute_command(): starting 9358 1726776640.05245: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776639.7344368-9358-211585200112970/AnsiballZ_file.py && sleep 0' 9358 1726776640.21128: stdout chunk (state=2): >>> {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmp68woqxg1", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 9358 1726776640.22257: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. 
<<< 9358 1726776640.22308: stderr chunk (state=3): >>><<< 9358 1726776640.22316: stdout chunk (state=3): >>><<< 9358 1726776640.22334: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmp68woqxg1", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.8.186 closed. 9358 1726776640.22363: done with _execute_module (ansible.legacy.file, {'mode': '0600', 'dest': '/etc/tuned/profile_mode', '_original_basename': 'tmp68woqxg1', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776639.7344368-9358-211585200112970/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9358 1726776640.22373: _low_level_execute_command(): starting 9358 1726776640.22379: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776639.7344368-9358-211585200112970/ > /dev/null 2>&1 && sleep 0' 9358 1726776640.24821: stderr chunk (state=2): >>><<< 9358 1726776640.24830: stdout chunk (state=2): >>><<< 9358 1726776640.24844: _low_level_execute_command() done: rc=0, stdout=, stderr= 9358 1726776640.24851: handler run complete 9358 1726776640.24873: attempt loop complete, returning result 9358 1726776640.24877: _execute() done 9358 1726776640.24880: dumping result to json 9358 1726776640.24885: done dumping result, returning 9358 1726776640.24894: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual [120fa90a-8a95-c4e4-06a7-000000000205] 9358 1726776640.24902: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000205 9358 1726776640.24936: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000205 9358 1726776640.24939: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/profile_mode", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "state": "file", "uid": 0 } 8283 1726776640.25103: no more pending results, returning what we have 8283 1726776640.25106: results queue empty 8283 1726776640.25107: checking for any_errors_fatal 8283 1726776640.25114: done checking for any_errors_fatal 8283 1726776640.25115: checking for 
max_fail_percentage 8283 1726776640.25116: done checking for max_fail_percentage 8283 1726776640.25116: checking to see if all hosts have failed and the running result is not ok 8283 1726776640.25117: done checking to see if all hosts have failed 8283 1726776640.25118: getting the remaining hosts for this loop 8283 1726776640.25118: done getting the remaining hosts for this loop 8283 1726776640.25121: getting the next task for host managed_node3 8283 1726776640.25127: done getting next task for host managed_node3 8283 1726776640.25131: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 8283 1726776640.25134: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776640.25144: getting variables 8283 1726776640.25145: in VariableManager get_vars() 8283 1726776640.25175: Calling all_inventory to load vars for managed_node3 8283 1726776640.25178: Calling groups_inventory to load vars for managed_node3 8283 1726776640.25179: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776640.25187: Calling all_plugins_play to load vars for managed_node3 8283 1726776640.25189: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776640.25193: Calling groups_plugins_play to load vars for managed_node3 8283 1726776640.25230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776640.25263: done with get_vars() 8283 1726776640.25269: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get current config] ********** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107 Thursday 19 September 2024 16:10:40 -0400 (0:00:00.556) 0:00:23.958 **** 8283 1726776640.25332: entering _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config 8283 1726776640.25489: worker is 1 (out of 1 available) 8283 1726776640.25506: exiting _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config 8283 1726776640.25517: done queuing things up, now waiting for results queue to drain 8283 1726776640.25519: waiting for pending results... 
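Editor's note: the "Set profile_mode to manual" task above finished with changed=false because both the checksum and the 0600 mode already matched the existing /etc/tuned/profile_mode. A hedged reconstruction of that task, built from the destination and mode in the result and the role variable names that appear earlier in the log (the exact content expression is an assumption, not the role's actual source), would be:

  # Hedged sketch only; the real role task may differ.
  - name: Set profile_mode to manual
    ansible.builtin.copy:
      content: "{{ __kernel_settings_tuned_profile_mode }}"
      dest: "{{ __kernel_settings_tuned_dir }}/profile_mode"
      mode: "0600"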
9370 1726776640.25633: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get current config 9370 1726776640.25735: in run() - task 120fa90a-8a95-c4e4-06a7-000000000206 9370 1726776640.25751: variable 'ansible_search_path' from source: unknown 9370 1726776640.25755: variable 'ansible_search_path' from source: unknown 9370 1726776640.25783: calling self._execute() 9370 1726776640.25909: variable 'ansible_host' from source: host vars for 'managed_node3' 9370 1726776640.25918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9370 1726776640.25926: variable 'omit' from source: magic vars 9370 1726776640.25997: variable 'omit' from source: magic vars 9370 1726776640.26037: variable 'omit' from source: magic vars 9370 1726776640.26058: variable '__kernel_settings_profile_filename' from source: role '' all vars 9370 1726776640.26258: variable '__kernel_settings_profile_filename' from source: role '' all vars 9370 1726776640.26318: variable '__kernel_settings_profile_dir' from source: role '' all vars 9370 1726776640.26378: variable '__kernel_settings_profile_parent' from source: set_fact 9370 1726776640.26385: variable '__kernel_settings_tuned_profile' from source: role '' all vars 9370 1726776640.26416: variable 'omit' from source: magic vars 9370 1726776640.26447: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9370 1726776640.26470: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9370 1726776640.26484: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9370 1726776640.26496: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9370 1726776640.26507: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9370 1726776640.26528: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9370 1726776640.26534: variable 'ansible_host' from source: host vars for 'managed_node3' 9370 1726776640.26539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9370 1726776640.26606: Set connection var ansible_module_compression to ZIP_DEFLATED 9370 1726776640.26614: Set connection var ansible_shell_type to sh 9370 1726776640.26620: Set connection var ansible_timeout to 10 9370 1726776640.26625: Set connection var ansible_connection to ssh 9370 1726776640.26637: Set connection var ansible_pipelining to False 9370 1726776640.26642: Set connection var ansible_shell_executable to /bin/sh 9370 1726776640.26657: variable 'ansible_shell_executable' from source: unknown 9370 1726776640.26661: variable 'ansible_connection' from source: unknown 9370 1726776640.26664: variable 'ansible_module_compression' from source: unknown 9370 1726776640.26667: variable 'ansible_shell_type' from source: unknown 9370 1726776640.26671: variable 'ansible_shell_executable' from source: unknown 9370 1726776640.26674: variable 'ansible_host' from source: host vars for 'managed_node3' 9370 1726776640.26679: variable 'ansible_pipelining' from source: unknown 9370 1726776640.26682: variable 'ansible_timeout' from source: unknown 9370 1726776640.26686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9370 1726776640.26806: Loading ActionModule 'normal' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9370 1726776640.26815: variable 'omit' from source: magic vars 9370 1726776640.26821: starting attempt loop 9370 1726776640.26823: running the handler 9370 1726776640.26834: _low_level_execute_command(): starting 9370 1726776640.26840: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9370 1726776640.29138: stdout chunk (state=2): >>>/root <<< 9370 1726776640.29255: stderr chunk (state=3): >>><<< 9370 1726776640.29261: stdout chunk (state=3): >>><<< 9370 1726776640.29277: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9370 1726776640.29290: _low_level_execute_command(): starting 9370 1726776640.29297: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776640.2928395-9370-234998608764232 `" && echo ansible-tmp-1726776640.2928395-9370-234998608764232="` echo /root/.ansible/tmp/ansible-tmp-1726776640.2928395-9370-234998608764232 `" ) && sleep 0' 9370 1726776640.31704: stdout chunk (state=2): >>>ansible-tmp-1726776640.2928395-9370-234998608764232=/root/.ansible/tmp/ansible-tmp-1726776640.2928395-9370-234998608764232 <<< 9370 1726776640.31827: stderr chunk (state=3): >>><<< 9370 1726776640.31834: stdout chunk (state=3): >>><<< 9370 1726776640.31847: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776640.2928395-9370-234998608764232=/root/.ansible/tmp/ansible-tmp-1726776640.2928395-9370-234998608764232 , stderr= 9370 1726776640.31880: variable 'ansible_module_compression' from source: unknown 9370 1726776640.31908: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 9370 1726776640.31938: variable 'ansible_facts' from source: unknown 9370 1726776640.32003: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776640.2928395-9370-234998608764232/AnsiballZ_kernel_settings_get_config.py 9370 1726776640.32095: Sending initial data 9370 1726776640.32101: Sent initial data (173 bytes) 9370 1726776640.34512: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpexk3icei /root/.ansible/tmp/ansible-tmp-1726776640.2928395-9370-234998608764232/AnsiballZ_kernel_settings_get_config.py <<< 9370 1726776640.35462: stderr chunk (state=3): >>><<< 9370 1726776640.35469: stdout chunk (state=3): >>><<< 9370 1726776640.35485: done transferring module to remote 9370 1726776640.35496: _low_level_execute_command(): starting 9370 1726776640.35501: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776640.2928395-9370-234998608764232/ /root/.ansible/tmp/ansible-tmp-1726776640.2928395-9370-234998608764232/AnsiballZ_kernel_settings_get_config.py && sleep 0' 9370 1726776640.37803: stderr chunk (state=2): >>><<< 9370 1726776640.37810: stdout chunk (state=2): >>><<< 9370 1726776640.37821: _low_level_execute_command() done: rc=0, stdout=, stderr= 9370 1726776640.37825: _low_level_execute_command(): starting 9370 1726776640.37831: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python 
/root/.ansible/tmp/ansible-tmp-1726776640.2928395-9370-234998608764232/AnsiballZ_kernel_settings_get_config.py && sleep 0' 9370 1726776640.53591: stdout chunk (state=2): >>> {"changed": false, "data": {"main": {"summary": "kernel settings"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} <<< 9370 1726776640.54710: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 9370 1726776640.54761: stderr chunk (state=3): >>><<< 9370 1726776640.54768: stdout chunk (state=3): >>><<< 9370 1726776640.54784: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"main": {"summary": "kernel settings"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} , stderr=Shared connection to 10.31.8.186 closed. 9370 1726776640.54809: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776640.2928395-9370-234998608764232/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9370 1726776640.54821: _low_level_execute_command(): starting 9370 1726776640.54826: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776640.2928395-9370-234998608764232/ > /dev/null 2>&1 && sleep 0' 9370 1726776640.57264: stderr chunk (state=2): >>><<< 9370 1726776640.57274: stdout chunk (state=2): >>><<< 9370 1726776640.57290: _low_level_execute_command() done: rc=0, stdout=, stderr= 9370 1726776640.57298: handler run complete 9370 1726776640.57312: attempt loop complete, returning result 9370 1726776640.57316: _execute() done 9370 1726776640.57320: dumping result to json 9370 1726776640.57324: done dumping result, returning 9370 1726776640.57332: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get current config [120fa90a-8a95-c4e4-06a7-000000000206] 9370 1726776640.57339: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000206 9370 1726776640.57366: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000206 9370 1726776640.57370: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "data": { "main": { "summary": "kernel settings" } } } 8283 1726776640.57594: no more pending results, returning what we have 8283 1726776640.57597: results queue empty 8283 1726776640.57598: checking for any_errors_fatal 8283 1726776640.57602: done checking for any_errors_fatal 8283 1726776640.57602: checking for max_fail_percentage 8283 1726776640.57604: done checking for max_fail_percentage 8283 1726776640.57604: checking to see if all hosts have failed and the running result is not ok 8283 1726776640.57605: done checking to see if all hosts have failed 8283 1726776640.57605: getting the remaining hosts for this loop 8283 1726776640.57606: done getting the remaining hosts for this loop 8283 1726776640.57609: getting the next task for host managed_node3 8283 1726776640.57614: done getting next task for 
host managed_node3 8283 1726776640.57617: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8283 1726776640.57620: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776640.57627: getting variables 8283 1726776640.57628: in VariableManager get_vars() 8283 1726776640.57650: Calling all_inventory to load vars for managed_node3 8283 1726776640.57652: Calling groups_inventory to load vars for managed_node3 8283 1726776640.57653: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776640.57659: Calling all_plugins_play to load vars for managed_node3 8283 1726776640.57661: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776640.57662: Calling groups_plugins_play to load vars for managed_node3 8283 1726776640.57698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776640.57727: done with get_vars() 8283 1726776640.57735: done getting variables 8283 1726776640.57775: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Apply kernel settings] ******* task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 Thursday 19 September 2024 16:10:40 -0400 (0:00:00.324) 0:00:24.282 **** 8283 1726776640.57799: entering _queue_task() for managed_node3/template 8283 1726776640.57960: worker is 1 (out of 1 available) 8283 1726776640.57974: exiting _queue_task() for managed_node3/template 8283 1726776640.57986: done queuing things up, now waiting for results queue to drain 8283 1726776640.57987: waiting for pending results... 
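Editor's note: the "Get current config" task above runs the collection's own kernel_settings_get_config module with a single path argument and receives the parsed tuned.conf sections back as data. A minimal equivalent call, with the path taken from the module_args in the result (the register name is an assumption), looks like:

  # Sketch based on the module_args shown in the log; register name is assumed.
  - name: Get current config
    fedora.linux_system_roles.kernel_settings_get_config:
      path: /etc/tuned/kernel_settings/tuned.conf
    register: __kernel_settings_current_config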
9385 1726776640.58103: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 9385 1726776640.58210: in run() - task 120fa90a-8a95-c4e4-06a7-000000000207 9385 1726776640.58225: variable 'ansible_search_path' from source: unknown 9385 1726776640.58230: variable 'ansible_search_path' from source: unknown 9385 1726776640.58257: calling self._execute() 9385 1726776640.58310: variable 'ansible_host' from source: host vars for 'managed_node3' 9385 1726776640.58319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9385 1726776640.58326: variable 'omit' from source: magic vars 9385 1726776640.58399: variable 'omit' from source: magic vars 9385 1726776640.58434: variable 'omit' from source: magic vars 9385 1726776640.58667: variable '__kernel_settings_profile_src' from source: role '' all vars 9385 1726776640.58676: variable '__kernel_settings_tuned_profile' from source: role '' all vars 9385 1726776640.58736: variable '__kernel_settings_tuned_profile' from source: role '' all vars 9385 1726776640.58755: variable '__kernel_settings_profile_filename' from source: role '' all vars 9385 1726776640.58801: variable '__kernel_settings_profile_filename' from source: role '' all vars 9385 1726776640.58850: variable '__kernel_settings_profile_dir' from source: role '' all vars 9385 1726776640.58911: variable '__kernel_settings_profile_parent' from source: set_fact 9385 1726776640.58918: variable '__kernel_settings_tuned_profile' from source: role '' all vars 9385 1726776640.58943: variable 'omit' from source: magic vars 9385 1726776640.58976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9385 1726776640.59005: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9385 1726776640.59024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9385 1726776640.59040: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9385 1726776640.59052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9385 1726776640.59075: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9385 1726776640.59079: variable 'ansible_host' from source: host vars for 'managed_node3' 9385 1726776640.59083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9385 1726776640.59152: Set connection var ansible_module_compression to ZIP_DEFLATED 9385 1726776640.59160: Set connection var ansible_shell_type to sh 9385 1726776640.59167: Set connection var ansible_timeout to 10 9385 1726776640.59172: Set connection var ansible_connection to ssh 9385 1726776640.59179: Set connection var ansible_pipelining to False 9385 1726776640.59184: Set connection var ansible_shell_executable to /bin/sh 9385 1726776640.59200: variable 'ansible_shell_executable' from source: unknown 9385 1726776640.59204: variable 'ansible_connection' from source: unknown 9385 1726776640.59207: variable 'ansible_module_compression' from source: unknown 9385 1726776640.59208: variable 'ansible_shell_type' from source: unknown 9385 1726776640.59210: variable 'ansible_shell_executable' from source: unknown 9385 1726776640.59212: variable 'ansible_host' from source: host vars for 'managed_node3' 9385 1726776640.59215: variable 'ansible_pipelining' from 
source: unknown 9385 1726776640.59217: variable 'ansible_timeout' from source: unknown 9385 1726776640.59219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9385 1726776640.59307: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9385 1726776640.59316: variable 'omit' from source: magic vars 9385 1726776640.59320: starting attempt loop 9385 1726776640.59324: running the handler 9385 1726776640.59334: _low_level_execute_command(): starting 9385 1726776640.59339: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9385 1726776640.61667: stdout chunk (state=2): >>>/root <<< 9385 1726776640.61784: stderr chunk (state=3): >>><<< 9385 1726776640.61790: stdout chunk (state=3): >>><<< 9385 1726776640.61808: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9385 1726776640.61818: _low_level_execute_command(): starting 9385 1726776640.61822: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776640.6181347-9385-3201921594509 `" && echo ansible-tmp-1726776640.6181347-9385-3201921594509="` echo /root/.ansible/tmp/ansible-tmp-1726776640.6181347-9385-3201921594509 `" ) && sleep 0' 9385 1726776640.64240: stdout chunk (state=2): >>>ansible-tmp-1726776640.6181347-9385-3201921594509=/root/.ansible/tmp/ansible-tmp-1726776640.6181347-9385-3201921594509 <<< 9385 1726776640.64368: stderr chunk (state=3): >>><<< 9385 1726776640.64376: stdout chunk (state=3): >>><<< 9385 1726776640.64392: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776640.6181347-9385-3201921594509=/root/.ansible/tmp/ansible-tmp-1726776640.6181347-9385-3201921594509 , stderr= 9385 1726776640.64407: evaluation_path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks 9385 1726776640.64425: search_path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2 9385 1726776640.64445: variable 'ansible_search_path' from source: unknown 9385 1726776640.65032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9385 1726776640.66436: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9385 1726776640.66488: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9385 1726776640.66517: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9385 
1726776640.66546: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9385 1726776640.66567: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9385 1726776640.66752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9385 1726776640.66775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9385 1726776640.66797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9385 1726776640.66824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9385 1726776640.66837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9385 1726776640.67058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9385 1726776640.67075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9385 1726776640.67091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9385 1726776640.67114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9385 1726776640.67122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9385 1726776640.67373: variable 'ansible_managed' from source: unknown 9385 1726776640.67381: variable '__sections' from source: task vars 9385 1726776640.67468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9385 1726776640.67486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9385 1726776640.67506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9385 1726776640.67535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9385 1726776640.67546: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9385 1726776640.67614: variable 'kernel_settings_sysctl' from source: include params 9385 1726776640.67621: variable '__kernel_settings_state_empty' from source: role '' all vars 9385 1726776640.67630: variable '__kernel_settings_previous_replaced' from source: role '' all vars 9385 1726776640.67659: variable '__sysctl_old' from source: task vars 9385 1726776640.67702: variable '__sysctl_old' from source: task vars 9385 1726776640.67843: variable 'kernel_settings_purge' from source: include params 9385 1726776640.67850: variable 'kernel_settings_sysctl' from source: include params 9385 1726776640.67855: variable '__kernel_settings_state_empty' from source: role '' all vars 9385 1726776640.67861: variable '__kernel_settings_previous_replaced' from source: role '' all vars 9385 1726776640.67865: variable '__kernel_settings_profile_contents' from source: set_fact 9385 1726776640.67988: variable 'kernel_settings_sysfs' from source: include params 9385 1726776640.67995: variable '__kernel_settings_state_empty' from source: role '' all vars 9385 1726776640.68000: variable '__kernel_settings_previous_replaced' from source: role '' all vars 9385 1726776640.68014: variable '__sysfs_old' from source: task vars 9385 1726776640.68058: variable '__sysfs_old' from source: task vars 9385 1726776640.68195: variable 'kernel_settings_purge' from source: include params 9385 1726776640.68201: variable 'kernel_settings_sysfs' from source: include params 9385 1726776640.68207: variable '__kernel_settings_state_empty' from source: role '' all vars 9385 1726776640.68212: variable '__kernel_settings_previous_replaced' from source: role '' all vars 9385 1726776640.68216: variable '__kernel_settings_profile_contents' from source: set_fact 9385 1726776640.68232: variable 'kernel_settings_systemd_cpu_affinity' from source: include params 9385 1726776640.68240: variable '__systemd_old' from source: task vars 9385 1726776640.68283: variable '__systemd_old' from source: task vars 9385 1726776640.68415: variable 'kernel_settings_purge' from source: include params 9385 1726776640.68422: variable 'kernel_settings_systemd_cpu_affinity' from source: include params 9385 1726776640.68427: variable '__kernel_settings_state_absent' from source: role '' all vars 9385 1726776640.68434: variable '__kernel_settings_profile_contents' from source: set_fact 9385 1726776640.68445: variable 'kernel_settings_transparent_hugepages' from source: include params 9385 1726776640.68450: variable 'kernel_settings_transparent_hugepages_defrag' from source: include params 9385 1726776640.68454: variable '__trans_huge_old' from source: task vars 9385 1726776640.68496: variable '__trans_huge_old' from source: task vars 9385 1726776640.68623: variable 'kernel_settings_purge' from source: include params 9385 1726776640.68631: variable 'kernel_settings_transparent_hugepages' from source: include params 9385 1726776640.68636: variable '__kernel_settings_state_absent' from source: role '' all vars 9385 1726776640.68642: variable '__kernel_settings_profile_contents' from source: set_fact 9385 1726776640.68651: variable '__trans_defrag_old' from source: task vars 9385 1726776640.68689: variable '__trans_defrag_old' from source: task vars 9385 1726776640.68820: variable 'kernel_settings_purge' from source: include params 9385 1726776640.68826: variable 
'kernel_settings_transparent_hugepages_defrag' from source: include params 9385 1726776640.68836: variable '__kernel_settings_state_absent' from source: role '' all vars 9385 1726776640.68841: variable '__kernel_settings_profile_contents' from source: set_fact 9385 1726776640.68857: variable '__kernel_settings_state_absent' from source: role '' all vars 9385 1726776640.68867: variable '__kernel_settings_state_absent' from source: role '' all vars 9385 1726776640.68874: variable '__kernel_settings_state_absent' from source: role '' all vars 9385 1726776640.69510: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9385 1726776640.69551: variable 'ansible_module_compression' from source: unknown 9385 1726776640.69592: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 9385 1726776640.69610: variable 'ansible_facts' from source: unknown 9385 1726776640.69670: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776640.6181347-9385-3201921594509/AnsiballZ_stat.py 9385 1726776640.69842: Sending initial data 9385 1726776640.69849: Sent initial data (149 bytes) 9385 1726776640.72687: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmp8f4eykm6 /root/.ansible/tmp/ansible-tmp-1726776640.6181347-9385-3201921594509/AnsiballZ_stat.py <<< 9385 1726776640.75150: stderr chunk (state=3): >>><<< 9385 1726776640.75157: stdout chunk (state=3): >>><<< 9385 1726776640.75178: done transferring module to remote 9385 1726776640.75190: _low_level_execute_command(): starting 9385 1726776640.75198: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776640.6181347-9385-3201921594509/ /root/.ansible/tmp/ansible-tmp-1726776640.6181347-9385-3201921594509/AnsiballZ_stat.py && sleep 0' 9385 1726776640.77757: stderr chunk (state=2): >>><<< 9385 1726776640.77765: stdout chunk (state=2): >>><<< 9385 1726776640.77780: _low_level_execute_command() done: rc=0, stdout=, stderr= 9385 1726776640.77784: _low_level_execute_command(): starting 9385 1726776640.77789: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776640.6181347-9385-3201921594509/AnsiballZ_stat.py && sleep 0' 9385 1726776640.93839: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 86, "inode": 515899586, "dev": 51713, "nlink": 1, "atime": 1726776631.5398922, "mtime": 1726776630.1968749, "ctime": 1726776630.4498782, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "mimetype": "text/plain", "charset": "us-ascii", "version": "1071231312", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", 
"follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 9385 1726776640.95022: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 9385 1726776640.95036: stdout chunk (state=3): >>><<< 9385 1726776640.95048: stderr chunk (state=3): >>><<< 9385 1726776640.95062: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 86, "inode": 515899586, "dev": 51713, "nlink": 1, "atime": 1726776631.5398922, "mtime": 1726776630.1968749, "ctime": 1726776630.4498782, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "mimetype": "text/plain", "charset": "us-ascii", "version": "1071231312", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.186 closed. 9385 1726776640.95132: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776640.6181347-9385-3201921594509/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9385 1726776640.95165: variable 'ansible_module_compression' from source: unknown 9385 1726776640.95210: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 9385 1726776640.95232: variable 'ansible_facts' from source: unknown 9385 1726776640.95330: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776640.6181347-9385-3201921594509/AnsiballZ_file.py 9385 1726776640.95810: Sending initial data 9385 1726776640.95817: Sent initial data (149 bytes) 9385 1726776640.99282: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpwn3ot73u /root/.ansible/tmp/ansible-tmp-1726776640.6181347-9385-3201921594509/AnsiballZ_file.py <<< 9385 1726776641.00656: stderr chunk (state=3): >>><<< 9385 1726776641.00666: stdout chunk (state=3): >>><<< 9385 1726776641.00690: done transferring module to remote 9385 1726776641.00701: _low_level_execute_command(): starting 9385 1726776641.00706: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776640.6181347-9385-3201921594509/ /root/.ansible/tmp/ansible-tmp-1726776640.6181347-9385-3201921594509/AnsiballZ_file.py && sleep 0' 9385 1726776641.03246: 
stderr chunk (state=2): >>><<< 9385 1726776641.03253: stdout chunk (state=2): >>><<< 9385 1726776641.03272: _low_level_execute_command() done: rc=0, stdout=, stderr= 9385 1726776641.03279: _low_level_execute_command(): starting 9385 1726776641.03285: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776640.6181347-9385-3201921594509/AnsiballZ_file.py && sleep 0' 9385 1726776641.19571: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings/tuned.conf", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings/tuned.conf"}, "after": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "invocation": {"module_args": {"mode": "0644", "dest": "/etc/tuned/kernel_settings/tuned.conf", "_original_basename": "kernel_settings.j2", "recurse": false, "state": "file", "path": "/etc/tuned/kernel_settings/tuned.conf", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 9385 1726776641.20702: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 9385 1726776641.20751: stderr chunk (state=3): >>><<< 9385 1726776641.20759: stdout chunk (state=3): >>><<< 9385 1726776641.20775: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings/tuned.conf", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings/tuned.conf"}, "after": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "invocation": {"module_args": {"mode": "0644", "dest": "/etc/tuned/kernel_settings/tuned.conf", "_original_basename": "kernel_settings.j2", "recurse": false, "state": "file", "path": "/etc/tuned/kernel_settings/tuned.conf", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.8.186 closed. 
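Editor's note: because the stat probe showed the rendered content already matches (checksum e44ba7fc...), the template action does not transfer a new file; it only runs the file module, as seen above, to enforce the 0644 mode and ownership. A standalone equivalent of that internal step, with arguments copied from the module_args in the result, would be roughly:

  # Illustration of the attribute-enforcement step the template action performs
  # internally when content is unchanged; arguments from the log's module_args.
  - name: Ensure attributes on the rendered tuned.conf
    ansible.builtin.file:
      path: /etc/tuned/kernel_settings/tuned.conf
      state: file
      mode: "0644"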
9385 1726776641.20806: done with _execute_module (ansible.legacy.file, {'mode': '0644', 'dest': '/etc/tuned/kernel_settings/tuned.conf', '_original_basename': 'kernel_settings.j2', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776640.6181347-9385-3201921594509/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9385 1726776641.20836: _low_level_execute_command(): starting 9385 1726776641.20843: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776640.6181347-9385-3201921594509/ > /dev/null 2>&1 && sleep 0' 9385 1726776641.23265: stderr chunk (state=2): >>><<< 9385 1726776641.23274: stdout chunk (state=2): >>><<< 9385 1726776641.23289: _low_level_execute_command() done: rc=0, stdout=, stderr= 9385 1726776641.23302: handler run complete 9385 1726776641.23323: attempt loop complete, returning result 9385 1726776641.23326: _execute() done 9385 1726776641.23332: dumping result to json 9385 1726776641.23337: done dumping result, returning 9385 1726776641.23345: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings [120fa90a-8a95-c4e4-06a7-000000000207] 9385 1726776641.23354: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000207 9385 1726776641.23402: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000207 9385 1726776641.23406: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "dest": "/etc/tuned/kernel_settings/tuned.conf", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "path": "/etc/tuned/kernel_settings/tuned.conf", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "state": "file", "uid": 0 } 8283 1726776641.23588: no more pending results, returning what we have 8283 1726776641.23591: results queue empty 8283 1726776641.23592: checking for any_errors_fatal 8283 1726776641.23600: done checking for any_errors_fatal 8283 1726776641.23601: checking for max_fail_percentage 8283 1726776641.23602: done checking for max_fail_percentage 8283 1726776641.23602: checking to see if all hosts have failed and the running result is not ok 8283 1726776641.23603: done checking to see if all hosts have failed 8283 1726776641.23604: getting the remaining hosts for this loop 8283 1726776641.23605: done getting the remaining hosts for this loop 8283 1726776641.23608: getting the next task for host managed_node3 8283 1726776641.23615: done getting next task for host managed_node3 8283 1726776641.23618: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8283 1726776641.23621: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776641.23632: getting variables 8283 1726776641.23633: in VariableManager get_vars() 8283 1726776641.23664: Calling all_inventory to load vars for managed_node3 8283 1726776641.23666: Calling groups_inventory to load vars for managed_node3 8283 1726776641.23668: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776641.23676: Calling all_plugins_play to load vars for managed_node3 8283 1726776641.23679: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776641.23681: Calling groups_plugins_play to load vars for managed_node3 8283 1726776641.23717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776641.23753: done with get_vars() 8283 1726776641.23760: done getting variables 8283 1726776641.23801: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149 Thursday 19 September 2024 16:10:41 -0400 (0:00:00.660) 0:00:24.942 **** 8283 1726776641.23823: entering _queue_task() for managed_node3/service 8283 1726776641.23997: worker is 1 (out of 1 available) 8283 1726776641.24011: exiting _queue_task() for managed_node3/service 8283 1726776641.24022: done queuing things up, now waiting for results queue to drain 8283 1726776641.24024: waiting for pending results... 
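Editor's note: the "Apply kernel settings" task above is a render of kernel_settings.j2 into the tuned profile directory; it reported ok/unchanged because the rendered output matched the existing file. A hedged reconstruction from the _original_basename, destination and mode visible in the result (the real role task may set additional parameters) is:

  # Hedged reconstruction; the actual role task may differ.
  - name: Apply kernel settings
    ansible.builtin.template:
      src: kernel_settings.j2
      dest: /etc/tuned/kernel_settings/tuned.conf
      mode: "0644"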
9409 1726776641.24139: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 9409 1726776641.24250: in run() - task 120fa90a-8a95-c4e4-06a7-000000000208 9409 1726776641.24265: variable 'ansible_search_path' from source: unknown 9409 1726776641.24270: variable 'ansible_search_path' from source: unknown 9409 1726776641.24306: variable '__kernel_settings_services' from source: include_vars 9409 1726776641.24539: variable '__kernel_settings_services' from source: include_vars 9409 1726776641.24600: variable 'omit' from source: magic vars 9409 1726776641.24675: variable 'ansible_host' from source: host vars for 'managed_node3' 9409 1726776641.24686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9409 1726776641.24695: variable 'omit' from source: magic vars 9409 1726776641.24909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9409 1726776641.25100: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9409 1726776641.25136: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9409 1726776641.25162: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9409 1726776641.25188: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9409 1726776641.25262: variable '__kernel_settings_register_profile' from source: set_fact 9409 1726776641.25274: variable '__kernel_settings_register_mode' from source: set_fact 9409 1726776641.25289: Evaluated conditional (__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed): False 9409 1726776641.25294: when evaluation is False, skipping this task 9409 1726776641.25317: variable 'item' from source: unknown 9409 1726776641.25367: variable 'item' from source: unknown skipping: [managed_node3] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed", "item": "tuned", "skip_reason": "Conditional result was False" } 9409 1726776641.25394: dumping result to json 9409 1726776641.25400: done dumping result, returning 9409 1726776641.25406: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes [120fa90a-8a95-c4e4-06a7-000000000208] 9409 1726776641.25412: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000208 9409 1726776641.25437: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000208 9409 1726776641.25440: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false } MSG: All items skipped 8283 1726776641.25604: no more pending results, returning what we have 8283 1726776641.25607: results queue empty 8283 1726776641.25607: checking for any_errors_fatal 8283 1726776641.25618: done checking for any_errors_fatal 8283 1726776641.25618: checking for max_fail_percentage 8283 1726776641.25619: done checking for max_fail_percentage 8283 1726776641.25620: checking to see if all hosts have failed and the running result is not ok 8283 1726776641.25620: done checking to see if all hosts have failed 8283 1726776641.25621: getting the remaining hosts for this loop 8283 1726776641.25622: done getting the remaining hosts for this loop 8283 
1726776641.25625: getting the next task for host managed_node3 8283 1726776641.25632: done getting next task for host managed_node3 8283 1726776641.25635: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8283 1726776641.25638: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776641.25650: getting variables 8283 1726776641.25652: in VariableManager get_vars() 8283 1726776641.25681: Calling all_inventory to load vars for managed_node3 8283 1726776641.25684: Calling groups_inventory to load vars for managed_node3 8283 1726776641.25686: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776641.25693: Calling all_plugins_play to load vars for managed_node3 8283 1726776641.25698: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776641.25700: Calling groups_plugins_play to load vars for managed_node3 8283 1726776641.25738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776641.25770: done with get_vars() 8283 1726776641.25776: done getting variables 8283 1726776641.25815: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Tuned apply settings] ******** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 Thursday 19 September 2024 16:10:41 -0400 (0:00:00.020) 0:00:24.963 **** 8283 1726776641.25838: entering _queue_task() for managed_node3/command 8283 1726776641.25992: worker is 1 (out of 1 available) 8283 1726776641.26008: exiting _queue_task() for managed_node3/command 8283 1726776641.26019: done queuing things up, now waiting for results queue to drain 8283 1726776641.26020: waiting for pending results... 
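Note: a hedged sketch of the "Tuned apply settings" step. The three conditionals mirror the evaluations logged just below; the actual command the role runs is not visible in this excerpt, so a hypothetical variable stands in for it.

    # Sketch (assumed): re-apply tuned settings in place only when the apply
    # step registered a change and neither the profile nor the mode changed.
    - name: Tuned apply settings
      ansible.builtin.command:
        cmd: "{{ __kernel_settings_tuned_apply_cmd }}"   # hypothetical placeholder, not shown in this log
      when:
        - not __kernel_settings_register_profile is changed
        - not __kernel_settings_register_mode is changed
        - __kernel_settings_register_apply is changed

Here __kernel_settings_register_apply is not changed, so the task is skipped.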
9410 1726776641.26221: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 9410 1726776641.26352: in run() - task 120fa90a-8a95-c4e4-06a7-000000000209 9410 1726776641.26367: variable 'ansible_search_path' from source: unknown 9410 1726776641.26371: variable 'ansible_search_path' from source: unknown 9410 1726776641.26404: calling self._execute() 9410 1726776641.26464: variable 'ansible_host' from source: host vars for 'managed_node3' 9410 1726776641.26474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9410 1726776641.26483: variable 'omit' from source: magic vars 9410 1726776641.26908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9410 1726776641.27175: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9410 1726776641.27216: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9410 1726776641.27246: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9410 1726776641.27277: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9410 1726776641.27391: variable '__kernel_settings_register_profile' from source: set_fact 9410 1726776641.27422: Evaluated conditional (not __kernel_settings_register_profile is changed): True 9410 1726776641.27552: variable '__kernel_settings_register_mode' from source: set_fact 9410 1726776641.27565: Evaluated conditional (not __kernel_settings_register_mode is changed): True 9410 1726776641.27675: variable '__kernel_settings_register_apply' from source: set_fact 9410 1726776641.27687: Evaluated conditional (__kernel_settings_register_apply is changed): False 9410 1726776641.27691: when evaluation is False, skipping this task 9410 1726776641.27697: _execute() done 9410 1726776641.27700: dumping result to json 9410 1726776641.27704: done dumping result, returning 9410 1726776641.27709: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings [120fa90a-8a95-c4e4-06a7-000000000209] 9410 1726776641.27715: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000209 9410 1726776641.27744: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000209 9410 1726776641.27747: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_register_apply is changed", "skip_reason": "Conditional result was False" } 8283 1726776641.28045: no more pending results, returning what we have 8283 1726776641.28048: results queue empty 8283 1726776641.28048: checking for any_errors_fatal 8283 1726776641.28055: done checking for any_errors_fatal 8283 1726776641.28056: checking for max_fail_percentage 8283 1726776641.28057: done checking for max_fail_percentage 8283 1726776641.28058: checking to see if all hosts have failed and the running result is not ok 8283 1726776641.28058: done checking to see if all hosts have failed 8283 1726776641.28059: getting the remaining hosts for this loop 8283 1726776641.28060: done getting the remaining hosts for this loop 8283 1726776641.28063: getting the next task for host managed_node3 8283 1726776641.28069: done getting next task for host managed_node3 8283 1726776641.28073: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8283 1726776641.28076: ^ state is: HOST 
STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776641.28089: getting variables 8283 1726776641.28090: in VariableManager get_vars() 8283 1726776641.28121: Calling all_inventory to load vars for managed_node3 8283 1726776641.28124: Calling groups_inventory to load vars for managed_node3 8283 1726776641.28126: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776641.28136: Calling all_plugins_play to load vars for managed_node3 8283 1726776641.28139: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776641.28142: Calling groups_plugins_play to load vars for managed_node3 8283 1726776641.28189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776641.28241: done with get_vars() 8283 1726776641.28249: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Verify settings] ************* task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:166 Thursday 19 September 2024 16:10:41 -0400 (0:00:00.024) 0:00:24.988 **** 8283 1726776641.28339: entering _queue_task() for managed_node3/include_tasks 8283 1726776641.28519: worker is 1 (out of 1 available) 8283 1726776641.28532: exiting _queue_task() for managed_node3/include_tasks 8283 1726776641.28542: done queuing things up, now waiting for results queue to drain 8283 1726776641.28543: waiting for pending results... 
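Note: the "Verify settings" step is queued through include_tasks. A sketch of the likely shape follows; the name of the included file is not shown anywhere in this log and is only a placeholder, while the conditional matches the evaluation logged below.

    # Sketch (assumed): pull in a separate task file that verifies the applied
    # settings, but only when the apply step actually reported a change.
    - name: Verify settings
      ansible.builtin.include_tasks: verify_settings.yml   # placeholder filename
      when: __kernel_settings_register_apply is changed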
9412 1726776641.28753: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Verify settings 9412 1726776641.28880: in run() - task 120fa90a-8a95-c4e4-06a7-00000000020a 9412 1726776641.28899: variable 'ansible_search_path' from source: unknown 9412 1726776641.28904: variable 'ansible_search_path' from source: unknown 9412 1726776641.28937: calling self._execute() 9412 1726776641.29003: variable 'ansible_host' from source: host vars for 'managed_node3' 9412 1726776641.29013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9412 1726776641.29022: variable 'omit' from source: magic vars 9412 1726776641.29453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9412 1726776641.29724: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9412 1726776641.29771: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9412 1726776641.29805: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9412 1726776641.29915: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9412 1726776641.30006: variable '__kernel_settings_register_apply' from source: set_fact 9412 1726776641.30028: Evaluated conditional (__kernel_settings_register_apply is changed): False 9412 1726776641.30033: when evaluation is False, skipping this task 9412 1726776641.30036: _execute() done 9412 1726776641.30039: dumping result to json 9412 1726776641.30042: done dumping result, returning 9412 1726776641.30046: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Verify settings [120fa90a-8a95-c4e4-06a7-00000000020a] 9412 1726776641.30051: sending task result for task 120fa90a-8a95-c4e4-06a7-00000000020a 9412 1726776641.30073: done sending task result for task 120fa90a-8a95-c4e4-06a7-00000000020a 9412 1726776641.30076: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__kernel_settings_register_apply is changed", "skip_reason": "Conditional result was False" } 8283 1726776641.30344: no more pending results, returning what we have 8283 1726776641.30346: results queue empty 8283 1726776641.30347: checking for any_errors_fatal 8283 1726776641.30351: done checking for any_errors_fatal 8283 1726776641.30352: checking for max_fail_percentage 8283 1726776641.30353: done checking for max_fail_percentage 8283 1726776641.30354: checking to see if all hosts have failed and the running result is not ok 8283 1726776641.30354: done checking to see if all hosts have failed 8283 1726776641.30355: getting the remaining hosts for this loop 8283 1726776641.30356: done getting the remaining hosts for this loop 8283 1726776641.30359: getting the next task for host managed_node3 8283 1726776641.30364: done getting next task for host managed_node3 8283 1726776641.30366: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 8283 1726776641.30370: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776641.30380: getting variables 8283 1726776641.30382: in VariableManager get_vars() 8283 1726776641.30410: Calling all_inventory to load vars for managed_node3 8283 1726776641.30412: Calling groups_inventory to load vars for managed_node3 8283 1726776641.30414: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776641.30421: Calling all_plugins_play to load vars for managed_node3 8283 1726776641.30423: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776641.30426: Calling groups_plugins_play to load vars for managed_node3 8283 1726776641.30469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776641.30512: done with get_vars() 8283 1726776641.30519: done getting variables 8283 1726776641.30566: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:177 Thursday 19 September 2024 16:10:41 -0400 (0:00:00.022) 0:00:25.010 **** 8283 1726776641.30592: entering _queue_task() for managed_node3/set_fact 8283 1726776641.30755: worker is 1 (out of 1 available) 8283 1726776641.30770: exiting _queue_task() for managed_node3/set_fact 8283 1726776641.30780: done queuing things up, now waiting for results queue to drain 8283 1726776641.30782: waiting for pending results... 
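Note: a sketch of the reboot-flag step. The fact name and the value it ends up with are taken from the "ok" result reported below; the expression the role actually uses to compute the value is not visible in this excerpt, so a literal is shown for illustration.

    # Sketch (assumed): record whether a reboot is required to apply changes.
    - name: Set the flag that reboot is needed to apply changes
      ansible.builtin.set_fact:
        kernel_settings_reboot_required: false   # assumed literal; the log only shows the resulting value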
9413 1726776641.31035: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 9413 1726776641.31163: in run() - task 120fa90a-8a95-c4e4-06a7-00000000020b 9413 1726776641.31180: variable 'ansible_search_path' from source: unknown 9413 1726776641.31185: variable 'ansible_search_path' from source: unknown 9413 1726776641.31218: calling self._execute() 9413 1726776641.31279: variable 'ansible_host' from source: host vars for 'managed_node3' 9413 1726776641.31289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9413 1726776641.31302: variable 'omit' from source: magic vars 9413 1726776641.31392: variable 'omit' from source: magic vars 9413 1726776641.31450: variable 'omit' from source: magic vars 9413 1726776641.31480: variable 'omit' from source: magic vars 9413 1726776641.31521: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9413 1726776641.31556: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9413 1726776641.31575: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9413 1726776641.31592: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9413 1726776641.31607: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9413 1726776641.31635: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9413 1726776641.31641: variable 'ansible_host' from source: host vars for 'managed_node3' 9413 1726776641.31646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9413 1726776641.31741: Set connection var ansible_module_compression to ZIP_DEFLATED 9413 1726776641.31751: Set connection var ansible_shell_type to sh 9413 1726776641.31758: Set connection var ansible_timeout to 10 9413 1726776641.31763: Set connection var ansible_connection to ssh 9413 1726776641.31771: Set connection var ansible_pipelining to False 9413 1726776641.31776: Set connection var ansible_shell_executable to /bin/sh 9413 1726776641.31798: variable 'ansible_shell_executable' from source: unknown 9413 1726776641.31803: variable 'ansible_connection' from source: unknown 9413 1726776641.31807: variable 'ansible_module_compression' from source: unknown 9413 1726776641.31810: variable 'ansible_shell_type' from source: unknown 9413 1726776641.31812: variable 'ansible_shell_executable' from source: unknown 9413 1726776641.31815: variable 'ansible_host' from source: host vars for 'managed_node3' 9413 1726776641.31818: variable 'ansible_pipelining' from source: unknown 9413 1726776641.31821: variable 'ansible_timeout' from source: unknown 9413 1726776641.31824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9413 1726776641.32025: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9413 1726776641.32043: variable 'omit' from source: magic vars 9413 1726776641.32048: starting attempt loop 9413 1726776641.32052: running the handler 9413 1726776641.32062: handler run complete 9413 1726776641.32072: 
attempt loop complete, returning result 9413 1726776641.32076: _execute() done 9413 1726776641.32079: dumping result to json 9413 1726776641.32082: done dumping result, returning 9413 1726776641.32088: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes [120fa90a-8a95-c4e4-06a7-00000000020b] 9413 1726776641.32097: sending task result for task 120fa90a-8a95-c4e4-06a7-00000000020b 9413 1726776641.32121: done sending task result for task 120fa90a-8a95-c4e4-06a7-00000000020b 9413 1726776641.32125: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "kernel_settings_reboot_required": false }, "changed": false } 8283 1726776641.32423: no more pending results, returning what we have 8283 1726776641.32426: results queue empty 8283 1726776641.32427: checking for any_errors_fatal 8283 1726776641.32433: done checking for any_errors_fatal 8283 1726776641.32433: checking for max_fail_percentage 8283 1726776641.32435: done checking for max_fail_percentage 8283 1726776641.32435: checking to see if all hosts have failed and the running result is not ok 8283 1726776641.32436: done checking to see if all hosts have failed 8283 1726776641.32436: getting the remaining hosts for this loop 8283 1726776641.32437: done getting the remaining hosts for this loop 8283 1726776641.32440: getting the next task for host managed_node3 8283 1726776641.32446: done getting next task for host managed_node3 8283 1726776641.32449: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8283 1726776641.32452: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8283 1726776641.32461: getting variables 8283 1726776641.32463: in VariableManager get_vars() 8283 1726776641.32492: Calling all_inventory to load vars for managed_node3 8283 1726776641.32498: Calling groups_inventory to load vars for managed_node3 8283 1726776641.32500: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776641.32508: Calling all_plugins_play to load vars for managed_node3 8283 1726776641.32511: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776641.32514: Calling groups_plugins_play to load vars for managed_node3 8283 1726776641.32563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776641.32613: done with get_vars() 8283 1726776641.32620: done getting variables 8283 1726776641.32671: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:181 Thursday 19 September 2024 16:10:41 -0400 (0:00:00.021) 0:00:25.031 **** 8283 1726776641.32704: entering _queue_task() for managed_node3/set_fact 8283 1726776641.32887: worker is 1 (out of 1 available) 8283 1726776641.32902: exiting _queue_task() for managed_node3/set_fact 8283 1726776641.32914: done queuing things up, now waiting for results queue to drain 8283 1726776641.32916: waiting for pending results... 9414 1726776641.33127: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 9414 1726776641.33264: in run() - task 120fa90a-8a95-c4e4-06a7-00000000020c 9414 1726776641.33282: variable 'ansible_search_path' from source: unknown 9414 1726776641.33286: variable 'ansible_search_path' from source: unknown 9414 1726776641.33319: calling self._execute() 9414 1726776641.33384: variable 'ansible_host' from source: host vars for 'managed_node3' 9414 1726776641.33396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9414 1726776641.33405: variable 'omit' from source: magic vars 9414 1726776641.33493: variable 'omit' from source: magic vars 9414 1726776641.33546: variable 'omit' from source: magic vars 9414 1726776641.33854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9414 1726776641.34122: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9414 1726776641.34157: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9414 1726776641.34182: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9414 1726776641.34210: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9414 1726776641.34326: variable '__kernel_settings_register_profile' from source: set_fact 9414 1726776641.34341: variable '__kernel_settings_register_mode' from source: set_fact 9414 1726776641.34348: variable '__kernel_settings_register_apply' from source: set_fact 9414 1726776641.34383: variable 'omit' from source: magic vars 9414 1726776641.34408: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9414 1726776641.34436: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9414 1726776641.34452: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9414 1726776641.34466: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9414 1726776641.34476: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9414 1726776641.34500: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9414 1726776641.34504: variable 'ansible_host' from source: host vars for 'managed_node3' 9414 1726776641.34508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9414 1726776641.34575: Set connection var ansible_module_compression to ZIP_DEFLATED 9414 1726776641.34582: Set connection var ansible_shell_type to sh 9414 1726776641.34589: Set connection var ansible_timeout to 10 9414 1726776641.34596: Set connection var ansible_connection to ssh 9414 1726776641.34603: Set connection var ansible_pipelining to False 9414 1726776641.34608: Set connection var ansible_shell_executable to /bin/sh 9414 1726776641.34625: variable 'ansible_shell_executable' from source: unknown 9414 1726776641.34630: variable 'ansible_connection' from source: unknown 9414 1726776641.34634: variable 'ansible_module_compression' from source: unknown 9414 1726776641.34638: variable 'ansible_shell_type' from source: unknown 9414 1726776641.34641: variable 'ansible_shell_executable' from source: unknown 9414 1726776641.34645: variable 'ansible_host' from source: host vars for 'managed_node3' 9414 1726776641.34649: variable 'ansible_pipelining' from source: unknown 9414 1726776641.34652: variable 'ansible_timeout' from source: unknown 9414 1726776641.34656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9414 1726776641.34722: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9414 1726776641.34734: variable 'omit' from source: magic vars 9414 1726776641.34740: starting attempt loop 9414 1726776641.34743: running the handler 9414 1726776641.34752: handler run complete 9414 1726776641.34759: attempt loop complete, returning result 9414 1726776641.34762: _execute() done 9414 1726776641.34765: dumping result to json 9414 1726776641.34768: done dumping result, returning 9414 1726776641.34774: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing [120fa90a-8a95-c4e4-06a7-00000000020c] 9414 1726776641.34780: sending task result for task 120fa90a-8a95-c4e4-06a7-00000000020c 9414 1726776641.34801: done sending task result for task 120fa90a-8a95-c4e4-06a7-00000000020c 9414 1726776641.34804: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__kernel_settings_changed": false }, "changed": false } 8283 1726776641.34911: no more pending results, returning what we have 8283 1726776641.34913: results queue empty 8283 1726776641.34914: checking for any_errors_fatal 8283 1726776641.34918: done 
checking for any_errors_fatal 8283 1726776641.34918: checking for max_fail_percentage 8283 1726776641.34920: done checking for max_fail_percentage 8283 1726776641.34921: checking to see if all hosts have failed and the running result is not ok 8283 1726776641.34921: done checking to see if all hosts have failed 8283 1726776641.34922: getting the remaining hosts for this loop 8283 1726776641.34923: done getting the remaining hosts for this loop 8283 1726776641.34925: getting the next task for host managed_node3 8283 1726776641.34934: done getting next task for host managed_node3 8283 1726776641.34936: ^ task is: TASK: meta (role_complete) 8283 1726776641.34939: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776641.34947: getting variables 8283 1726776641.34948: in VariableManager get_vars() 8283 1726776641.34977: Calling all_inventory to load vars for managed_node3 8283 1726776641.34980: Calling groups_inventory to load vars for managed_node3 8283 1726776641.34981: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776641.34989: Calling all_plugins_play to load vars for managed_node3 8283 1726776641.34991: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776641.34993: Calling groups_plugins_play to load vars for managed_node3 8283 1726776641.35029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776641.35062: done with get_vars() 8283 1726776641.35067: done getting variables 8283 1726776641.35116: done queuing things up, now waiting for results queue to drain 8283 1726776641.35121: results queue empty 8283 1726776641.35121: checking for any_errors_fatal 8283 1726776641.35124: done checking for any_errors_fatal 8283 1726776641.35124: checking for max_fail_percentage 8283 1726776641.35125: done checking for max_fail_percentage 8283 1726776641.35125: checking to see if all hosts have failed and the running result is not ok 8283 1726776641.35126: done checking to see if all hosts have failed 8283 1726776641.35126: getting the remaining hosts for this loop 8283 1726776641.35126: done getting the remaining hosts for this loop 8283 1726776641.35128: getting the next task for host managed_node3 8283 1726776641.35132: done getting next task for host managed_node3 8283 1726776641.35133: ^ task is: TASK: Verify no settings 8283 1726776641.35134: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776641.35136: getting variables 8283 1726776641.35136: in VariableManager get_vars() 8283 1726776641.35144: Calling all_inventory to load vars for managed_node3 8283 1726776641.35145: Calling groups_inventory to load vars for managed_node3 8283 1726776641.35147: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776641.35149: Calling all_plugins_play to load vars for managed_node3 8283 1726776641.35151: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776641.35152: Calling groups_plugins_play to load vars for managed_node3 8283 1726776641.35172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776641.35188: done with get_vars() 8283 1726776641.35192: done getting variables 8283 1726776641.35217: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify no settings] ****************************************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:20 Thursday 19 September 2024 16:10:41 -0400 (0:00:00.025) 0:00:25.057 **** 8283 1726776641.35237: entering _queue_task() for managed_node3/shell 8283 1726776641.35380: worker is 1 (out of 1 available) 8283 1726776641.35393: exiting _queue_task() for managed_node3/shell 8283 1726776641.35403: done queuing things up, now waiting for results queue to drain 8283 1726776641.35405: waiting for pending results... 
9416 1726776641.35513: running TaskExecutor() for managed_node3/TASK: Verify no settings 9416 1726776641.35603: in run() - task 120fa90a-8a95-c4e4-06a7-000000000154 9416 1726776641.35617: variable 'ansible_search_path' from source: unknown 9416 1726776641.35621: variable 'ansible_search_path' from source: unknown 9416 1726776641.35649: calling self._execute() 9416 1726776641.35785: variable 'ansible_host' from source: host vars for 'managed_node3' 9416 1726776641.35793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9416 1726776641.35804: variable 'omit' from source: magic vars 9416 1726776641.35872: variable 'omit' from source: magic vars 9416 1726776641.35903: variable 'omit' from source: magic vars 9416 1726776641.36169: variable '__kernel_settings_profile_filename' from source: role '' exported vars 9416 1726776641.36241: variable '__kernel_settings_profile_dir' from source: role '' exported vars 9416 1726776641.36304: variable '__kernel_settings_profile_parent' from source: set_fact 9416 1726776641.36313: variable '__kernel_settings_tuned_profile' from source: role '' exported vars 9416 1726776641.36345: variable 'omit' from source: magic vars 9416 1726776641.36377: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9416 1726776641.36404: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9416 1726776641.36421: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9416 1726776641.36439: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9416 1726776641.36452: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9416 1726776641.36478: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9416 1726776641.36484: variable 'ansible_host' from source: host vars for 'managed_node3' 9416 1726776641.36489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9416 1726776641.36585: Set connection var ansible_module_compression to ZIP_DEFLATED 9416 1726776641.36593: Set connection var ansible_shell_type to sh 9416 1726776641.36602: Set connection var ansible_timeout to 10 9416 1726776641.36607: Set connection var ansible_connection to ssh 9416 1726776641.36615: Set connection var ansible_pipelining to False 9416 1726776641.36620: Set connection var ansible_shell_executable to /bin/sh 9416 1726776641.36639: variable 'ansible_shell_executable' from source: unknown 9416 1726776641.36643: variable 'ansible_connection' from source: unknown 9416 1726776641.36646: variable 'ansible_module_compression' from source: unknown 9416 1726776641.36648: variable 'ansible_shell_type' from source: unknown 9416 1726776641.36651: variable 'ansible_shell_executable' from source: unknown 9416 1726776641.36653: variable 'ansible_host' from source: host vars for 'managed_node3' 9416 1726776641.36657: variable 'ansible_pipelining' from source: unknown 9416 1726776641.36659: variable 'ansible_timeout' from source: unknown 9416 1726776641.36662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9416 1726776641.36771: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9416 1726776641.36782: variable 'omit' from source: magic vars 9416 1726776641.36787: starting attempt loop 9416 1726776641.36789: running the handler 9416 1726776641.36800: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9416 1726776641.36814: _low_level_execute_command(): starting 9416 1726776641.36822: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9416 1726776641.39147: stdout chunk (state=2): >>>/root <<< 9416 1726776641.39293: stderr chunk (state=3): >>><<< 9416 1726776641.39304: stdout chunk (state=3): >>><<< 9416 1726776641.39324: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9416 1726776641.39340: _low_level_execute_command(): starting 9416 1726776641.39347: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776641.39334-9416-212637036157536 `" && echo ansible-tmp-1726776641.39334-9416-212637036157536="` echo /root/.ansible/tmp/ansible-tmp-1726776641.39334-9416-212637036157536 `" ) && sleep 0' 9416 1726776641.42164: stdout chunk (state=2): >>>ansible-tmp-1726776641.39334-9416-212637036157536=/root/.ansible/tmp/ansible-tmp-1726776641.39334-9416-212637036157536 <<< 9416 1726776641.42175: stderr chunk (state=2): >>><<< 9416 1726776641.42186: stdout chunk (state=3): >>><<< 9416 1726776641.42202: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776641.39334-9416-212637036157536=/root/.ansible/tmp/ansible-tmp-1726776641.39334-9416-212637036157536 , stderr= 9416 1726776641.42233: variable 'ansible_module_compression' from source: unknown 9416 1726776641.42284: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 9416 1726776641.42319: variable 'ansible_facts' from source: unknown 9416 1726776641.42421: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776641.39334-9416-212637036157536/AnsiballZ_command.py 9416 1726776641.42967: Sending initial data 9416 1726776641.42974: Sent initial data (152 bytes) 9416 1726776641.46587: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmp_5j7_j1w /root/.ansible/tmp/ansible-tmp-1726776641.39334-9416-212637036157536/AnsiballZ_command.py <<< 9416 1726776641.47659: stderr chunk (state=3): >>><<< 9416 1726776641.47667: stdout chunk (state=3): >>><<< 9416 1726776641.47686: done transferring module to remote 9416 1726776641.47697: _low_level_execute_command(): starting 9416 1726776641.47701: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776641.39334-9416-212637036157536/ /root/.ansible/tmp/ansible-tmp-1726776641.39334-9416-212637036157536/AnsiballZ_command.py && sleep 0' 9416 1726776641.50037: stderr chunk (state=2): >>><<< 9416 1726776641.50044: stdout chunk (state=2): >>><<< 9416 1726776641.50059: _low_level_execute_command() done: rc=0, stdout=, stderr= 9416 1726776641.50064: _low_level_execute_command(): starting 9416 1726776641.50069: _low_level_execute_command(): executing: /bin/sh -c 
'/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776641.39334-9416-212637036157536/AnsiballZ_command.py && sleep 0' 9416 1726776641.65995: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ conf=/etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysctl\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysfs\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[systemd\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[vm\\]' /etc/tuned/kernel_settings/tuned.conf\n+ exit 0", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "start": "2024-09-19 16:10:41.651222", "end": "2024-09-19 16:10:41.658790", "delta": "0:00:00.007568", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 9416 1726776641.67201: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 9416 1726776641.67244: stderr chunk (state=3): >>><<< 9416 1726776641.67251: stdout chunk (state=3): >>><<< 9416 1726776641.67268: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ conf=/etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysctl\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysfs\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[systemd\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[vm\\]' /etc/tuned/kernel_settings/tuned.conf\n+ exit 0", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "start": "2024-09-19 16:10:41.651222", "end": "2024-09-19 16:10:41.658790", "delta": "0:00:00.007568", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.8.186 closed. 
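Note: the _raw_params string above is hard to read through the JSON escaping. Reformatted as a plain shell task, the check amounts to the following; the script body is taken directly from the result above (only the layout differs), while changed_when is an assumption inferred from the "changed": false in the reported result further below.

    # Verify that no kernel_settings sections remain in the generated tuned.conf.
    - name: Verify no settings
      ansible.builtin.shell: |
        set -euxo pipefail
        exec 1>&2
        rc=0
        conf=/etc/tuned/kernel_settings/tuned.conf
        for section in sysctl sysfs systemd vm; do
          if grep ^\\["$section"\\] "$conf"; then
            echo ERROR: "$section" settings present
            rc=1
          fi
        done
        exit "$rc"
      changed_when: false   # assumed; the task is reported as ok/changed=false

None of the grep calls match, so the script exits 0 and the verification passes.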
9416 1726776641.67302: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\["$section"\\\\] "$conf"; then\n echo ERROR: "$section" settings present\n rc=1\n fi\ndone\nexit "$rc"\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776641.39334-9416-212637036157536/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9416 1726776641.67311: _low_level_execute_command(): starting 9416 1726776641.67318: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776641.39334-9416-212637036157536/ > /dev/null 2>&1 && sleep 0' 9416 1726776641.69709: stderr chunk (state=2): >>><<< 9416 1726776641.69718: stdout chunk (state=2): >>><<< 9416 1726776641.69735: _low_level_execute_command() done: rc=0, stdout=, stderr= 9416 1726776641.69742: handler run complete 9416 1726776641.69760: Evaluated conditional (False): False 9416 1726776641.69767: attempt loop complete, returning result 9416 1726776641.69770: _execute() done 9416 1726776641.69772: dumping result to json 9416 1726776641.69775: done dumping result, returning 9416 1726776641.69780: done running TaskExecutor() for managed_node3/TASK: Verify no settings [120fa90a-8a95-c4e4-06a7-000000000154] 9416 1726776641.69785: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000154 9416 1726776641.69818: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000154 9416 1726776641.69820: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "delta": "0:00:00.007568", "end": "2024-09-19 16:10:41.658790", "rc": 0, "start": "2024-09-19 16:10:41.651222" } STDERR: + exec + rc=0 + conf=/etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[sysctl\]' /etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[sysfs\]' /etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[systemd\]' /etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[vm\]' /etc/tuned/kernel_settings/tuned.conf + exit 0 8283 1726776641.70072: no more pending results, returning what we have 8283 1726776641.70074: results queue empty 8283 1726776641.70074: checking for any_errors_fatal 8283 1726776641.70075: done checking for any_errors_fatal 8283 1726776641.70076: checking for max_fail_percentage 8283 1726776641.70077: done checking for max_fail_percentage 8283 1726776641.70077: checking to see if all hosts have failed and the running result is not ok 8283 1726776641.70077: done checking to see if all hosts have failed 8283 1726776641.70078: getting the remaining hosts for this loop 8283 
1726776641.70078: done getting the remaining hosts for this loop 8283 1726776641.70081: getting the next task for host managed_node3 8283 1726776641.70085: done getting next task for host managed_node3 8283 1726776641.70086: ^ task is: TASK: Remove kernel_settings tuned profile 8283 1726776641.70088: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776641.70090: getting variables 8283 1726776641.70091: in VariableManager get_vars() 8283 1726776641.70110: Calling all_inventory to load vars for managed_node3 8283 1726776641.70112: Calling groups_inventory to load vars for managed_node3 8283 1726776641.70113: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776641.70120: Calling all_plugins_play to load vars for managed_node3 8283 1726776641.70125: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776641.70127: Calling groups_plugins_play to load vars for managed_node3 8283 1726776641.70163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776641.70185: done with get_vars() 8283 1726776641.70191: done getting variables TASK [Remove kernel_settings tuned profile] ************************************ task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:36 Thursday 19 September 2024 16:10:41 -0400 (0:00:00.350) 0:00:25.407 **** 8283 1726776641.70254: entering _queue_task() for managed_node3/file 8283 1726776641.70404: worker is 1 (out of 1 available) 8283 1726776641.70419: exiting _queue_task() for managed_node3/file 8283 1726776641.70432: done queuing things up, now waiting for results queue to drain 8283 1726776641.70434: waiting for pending results... 
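Note: a sketch of the cleanup step, based on the module_args visible in the file-module result below (path /etc/tuned/kernel_settings, state absent). The path is written literally here for clarity; judging by the variables resolved below, the test most likely builds it from __kernel_settings_profile_dir.

    # Remove the generated tuned profile directory created by the role.
    - name: Remove kernel_settings tuned profile
      ansible.builtin.file:
        path: /etc/tuned/kernel_settings   # shown literally; likely "{{ __kernel_settings_profile_dir }}" in the test
        state: absent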
9439 1726776641.70567: running TaskExecutor() for managed_node3/TASK: Remove kernel_settings tuned profile 9439 1726776641.70683: in run() - task 120fa90a-8a95-c4e4-06a7-000000000155 9439 1726776641.70702: variable 'ansible_search_path' from source: unknown 9439 1726776641.70707: variable 'ansible_search_path' from source: unknown 9439 1726776641.70740: calling self._execute() 9439 1726776641.70806: variable 'ansible_host' from source: host vars for 'managed_node3' 9439 1726776641.70815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9439 1726776641.70822: variable 'omit' from source: magic vars 9439 1726776641.70920: variable 'omit' from source: magic vars 9439 1726776641.70964: variable 'omit' from source: magic vars 9439 1726776641.70992: variable '__kernel_settings_profile_dir' from source: role '' exported vars 9439 1726776641.71279: variable '__kernel_settings_profile_dir' from source: role '' exported vars 9439 1726776641.71375: variable '__kernel_settings_profile_parent' from source: set_fact 9439 1726776641.71383: variable '__kernel_settings_tuned_profile' from source: role '' exported vars 9439 1726776641.71426: variable 'omit' from source: magic vars 9439 1726776641.71479: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9439 1726776641.71515: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9439 1726776641.71538: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9439 1726776641.71554: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9439 1726776641.71566: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9439 1726776641.71592: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9439 1726776641.71600: variable 'ansible_host' from source: host vars for 'managed_node3' 9439 1726776641.71604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9439 1726776641.71693: Set connection var ansible_module_compression to ZIP_DEFLATED 9439 1726776641.71704: Set connection var ansible_shell_type to sh 9439 1726776641.71710: Set connection var ansible_timeout to 10 9439 1726776641.71715: Set connection var ansible_connection to ssh 9439 1726776641.71722: Set connection var ansible_pipelining to False 9439 1726776641.71727: Set connection var ansible_shell_executable to /bin/sh 9439 1726776641.71747: variable 'ansible_shell_executable' from source: unknown 9439 1726776641.71752: variable 'ansible_connection' from source: unknown 9439 1726776641.71756: variable 'ansible_module_compression' from source: unknown 9439 1726776641.71759: variable 'ansible_shell_type' from source: unknown 9439 1726776641.71762: variable 'ansible_shell_executable' from source: unknown 9439 1726776641.71765: variable 'ansible_host' from source: host vars for 'managed_node3' 9439 1726776641.71768: variable 'ansible_pipelining' from source: unknown 9439 1726776641.71771: variable 'ansible_timeout' from source: unknown 9439 1726776641.71775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9439 1726776641.71956: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9439 1726776641.71967: variable 'omit' from source: magic vars 9439 1726776641.71973: starting attempt loop 9439 1726776641.71976: running the handler 9439 1726776641.71988: _low_level_execute_command(): starting 9439 1726776641.71999: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9439 1726776641.74265: stdout chunk (state=2): >>>/root <<< 9439 1726776641.74556: stderr chunk (state=3): >>><<< 9439 1726776641.74567: stdout chunk (state=3): >>><<< 9439 1726776641.74591: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9439 1726776641.74607: _low_level_execute_command(): starting 9439 1726776641.74615: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776641.7459924-9439-262075858231802 `" && echo ansible-tmp-1726776641.7459924-9439-262075858231802="` echo /root/.ansible/tmp/ansible-tmp-1726776641.7459924-9439-262075858231802 `" ) && sleep 0' 9439 1726776641.77247: stdout chunk (state=2): >>>ansible-tmp-1726776641.7459924-9439-262075858231802=/root/.ansible/tmp/ansible-tmp-1726776641.7459924-9439-262075858231802 <<< 9439 1726776641.77690: stderr chunk (state=3): >>><<< 9439 1726776641.77703: stdout chunk (state=3): >>><<< 9439 1726776641.77720: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776641.7459924-9439-262075858231802=/root/.ansible/tmp/ansible-tmp-1726776641.7459924-9439-262075858231802 , stderr= 9439 1726776641.77762: variable 'ansible_module_compression' from source: unknown 9439 1726776641.77802: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 9439 1726776641.77833: variable 'ansible_facts' from source: unknown 9439 1726776641.77905: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776641.7459924-9439-262075858231802/AnsiballZ_file.py 9439 1726776641.78001: Sending initial data 9439 1726776641.78009: Sent initial data (151 bytes) 9439 1726776641.80480: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpdyb50ksz /root/.ansible/tmp/ansible-tmp-1726776641.7459924-9439-262075858231802/AnsiballZ_file.py <<< 9439 1726776641.81481: stderr chunk (state=3): >>><<< 9439 1726776641.81487: stdout chunk (state=3): >>><<< 9439 1726776641.81508: done transferring module to remote 9439 1726776641.81518: _low_level_execute_command(): starting 9439 1726776641.81523: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776641.7459924-9439-262075858231802/ /root/.ansible/tmp/ansible-tmp-1726776641.7459924-9439-262075858231802/AnsiballZ_file.py && sleep 0' 9439 1726776641.83841: stderr chunk (state=2): >>><<< 9439 1726776641.83847: stdout chunk (state=2): >>><<< 9439 1726776641.83859: _low_level_execute_command() done: rc=0, stdout=, stderr= 9439 1726776641.83862: _low_level_execute_command(): starting 9439 1726776641.83866: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776641.7459924-9439-262075858231802/AnsiballZ_file.py && sleep 0' 9439 1726776642.00237: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "directory", "path_content": {"directories": [], 
"files": ["/etc/tuned/kernel_settings/tuned.conf"]}}, "after": {"path": "/etc/tuned/kernel_settings", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "absent", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 9439 1726776642.01422: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 9439 1726776642.01436: stdout chunk (state=3): >>><<< 9439 1726776642.01448: stderr chunk (state=3): >>><<< 9439 1726776642.01464: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "directory", "path_content": {"directories": [], "files": ["/etc/tuned/kernel_settings/tuned.conf"]}}, "after": {"path": "/etc/tuned/kernel_settings", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "absent", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.8.186 closed. 9439 1726776642.01511: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'absent', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776641.7459924-9439-262075858231802/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9439 1726776642.01525: _low_level_execute_command(): starting 9439 1726776642.01533: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776641.7459924-9439-262075858231802/ > /dev/null 2>&1 && sleep 0' 9439 1726776642.04445: stderr chunk (state=2): >>><<< 9439 1726776642.04453: stdout chunk (state=2): >>><<< 9439 1726776642.04467: _low_level_execute_command() done: rc=0, stdout=, stderr= 9439 1726776642.04473: handler run complete 9439 1726776642.04494: attempt loop complete, returning result 9439 1726776642.04500: _execute() done 9439 1726776642.04504: dumping result to json 9439 1726776642.04510: done dumping result, returning 9439 1726776642.04519: done running TaskExecutor() for managed_node3/TASK: Remove kernel_settings tuned profile [120fa90a-8a95-c4e4-06a7-000000000155] 9439 1726776642.04525: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000155 9439 1726776642.04557: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000155 9439 1726776642.04561: WORKER PROCESS EXITING changed: [managed_node3] 
=> { "changed": true, "path": "/etc/tuned/kernel_settings", "state": "absent" } 8283 1726776642.04781: no more pending results, returning what we have 8283 1726776642.04783: results queue empty 8283 1726776642.04783: checking for any_errors_fatal 8283 1726776642.04791: done checking for any_errors_fatal 8283 1726776642.04792: checking for max_fail_percentage 8283 1726776642.04792: done checking for max_fail_percentage 8283 1726776642.04793: checking to see if all hosts have failed and the running result is not ok 8283 1726776642.04793: done checking to see if all hosts have failed 8283 1726776642.04794: getting the remaining hosts for this loop 8283 1726776642.04794: done getting the remaining hosts for this loop 8283 1726776642.04797: getting the next task for host managed_node3 8283 1726776642.04802: done getting next task for host managed_node3 8283 1726776642.04803: ^ task is: TASK: Get active_profile 8283 1726776642.04805: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776642.04807: getting variables 8283 1726776642.04808: in VariableManager get_vars() 8283 1726776642.04833: Calling all_inventory to load vars for managed_node3 8283 1726776642.04835: Calling groups_inventory to load vars for managed_node3 8283 1726776642.04837: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776642.04843: Calling all_plugins_play to load vars for managed_node3 8283 1726776642.04845: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776642.04846: Calling groups_plugins_play to load vars for managed_node3 8283 1726776642.04882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776642.04907: done with get_vars() 8283 1726776642.04913: done getting variables TASK [Get active_profile] ****************************************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:41 Thursday 19 September 2024 16:10:42 -0400 (0:00:00.347) 0:00:25.754 **** 8283 1726776642.04978: entering _queue_task() for managed_node3/slurp 8283 1726776642.05139: worker is 1 (out of 1 available) 8283 1726776642.05153: exiting _queue_task() for managed_node3/slurp 8283 1726776642.05165: done queuing things up, now waiting for results queue to drain 8283 1726776642.05167: waiting for pending results... 
9455 1726776642.05294: running TaskExecutor() for managed_node3/TASK: Get active_profile 9455 1726776642.05394: in run() - task 120fa90a-8a95-c4e4-06a7-000000000156 9455 1726776642.05409: variable 'ansible_search_path' from source: unknown 9455 1726776642.05414: variable 'ansible_search_path' from source: unknown 9455 1726776642.05445: calling self._execute() 9455 1726776642.05509: variable 'ansible_host' from source: host vars for 'managed_node3' 9455 1726776642.05518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9455 1726776642.05528: variable 'omit' from source: magic vars 9455 1726776642.05622: variable 'omit' from source: magic vars 9455 1726776642.05656: variable 'omit' from source: magic vars 9455 1726776642.05676: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars 9455 1726776642.05889: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars 9455 1726776642.05950: variable '__kernel_settings_tuned_dir' from source: role '' exported vars 9455 1726776642.05977: variable 'omit' from source: magic vars 9455 1726776642.06008: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9455 1726776642.06034: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9455 1726776642.06054: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9455 1726776642.06072: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9455 1726776642.06086: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9455 1726776642.06117: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9455 1726776642.06123: variable 'ansible_host' from source: host vars for 'managed_node3' 9455 1726776642.06128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9455 1726776642.06248: Set connection var ansible_module_compression to ZIP_DEFLATED 9455 1726776642.06257: Set connection var ansible_shell_type to sh 9455 1726776642.06264: Set connection var ansible_timeout to 10 9455 1726776642.06269: Set connection var ansible_connection to ssh 9455 1726776642.06276: Set connection var ansible_pipelining to False 9455 1726776642.06281: Set connection var ansible_shell_executable to /bin/sh 9455 1726776642.06301: variable 'ansible_shell_executable' from source: unknown 9455 1726776642.06305: variable 'ansible_connection' from source: unknown 9455 1726776642.06308: variable 'ansible_module_compression' from source: unknown 9455 1726776642.06310: variable 'ansible_shell_type' from source: unknown 9455 1726776642.06313: variable 'ansible_shell_executable' from source: unknown 9455 1726776642.06315: variable 'ansible_host' from source: host vars for 'managed_node3' 9455 1726776642.06318: variable 'ansible_pipelining' from source: unknown 9455 1726776642.06321: variable 'ansible_timeout' from source: unknown 9455 1726776642.06324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9455 1726776642.06505: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9455 1726776642.06517: 
variable 'omit' from source: magic vars 9455 1726776642.06522: starting attempt loop 9455 1726776642.06527: running the handler 9455 1726776642.06540: _low_level_execute_command(): starting 9455 1726776642.06548: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9455 1726776642.09034: stdout chunk (state=2): >>>/root <<< 9455 1726776642.09044: stderr chunk (state=2): >>><<< 9455 1726776642.09055: stdout chunk (state=3): >>><<< 9455 1726776642.09071: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9455 1726776642.09083: _low_level_execute_command(): starting 9455 1726776642.09089: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776642.0907795-9455-163647959214981 `" && echo ansible-tmp-1726776642.0907795-9455-163647959214981="` echo /root/.ansible/tmp/ansible-tmp-1726776642.0907795-9455-163647959214981 `" ) && sleep 0' 9455 1726776642.12042: stdout chunk (state=2): >>>ansible-tmp-1726776642.0907795-9455-163647959214981=/root/.ansible/tmp/ansible-tmp-1726776642.0907795-9455-163647959214981 <<< 9455 1726776642.12188: stderr chunk (state=3): >>><<< 9455 1726776642.12196: stdout chunk (state=3): >>><<< 9455 1726776642.12217: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776642.0907795-9455-163647959214981=/root/.ansible/tmp/ansible-tmp-1726776642.0907795-9455-163647959214981 , stderr= 9455 1726776642.12263: variable 'ansible_module_compression' from source: unknown 9455 1726776642.12305: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.slurp-ZIP_DEFLATED 9455 1726776642.12341: variable 'ansible_facts' from source: unknown 9455 1726776642.12422: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776642.0907795-9455-163647959214981/AnsiballZ_slurp.py 9455 1726776642.13185: Sending initial data 9455 1726776642.13193: Sent initial data (152 bytes) 9455 1726776642.15609: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpifz9xzqv /root/.ansible/tmp/ansible-tmp-1726776642.0907795-9455-163647959214981/AnsiballZ_slurp.py <<< 9455 1726776642.16902: stderr chunk (state=3): >>><<< 9455 1726776642.16911: stdout chunk (state=3): >>><<< 9455 1726776642.16935: done transferring module to remote 9455 1726776642.16949: _low_level_execute_command(): starting 9455 1726776642.16956: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776642.0907795-9455-163647959214981/ /root/.ansible/tmp/ansible-tmp-1726776642.0907795-9455-163647959214981/AnsiballZ_slurp.py && sleep 0' 9455 1726776642.19585: stderr chunk (state=2): >>><<< 9455 1726776642.19593: stdout chunk (state=2): >>><<< 9455 1726776642.19611: _low_level_execute_command() done: rc=0, stdout=, stderr= 9455 1726776642.19617: _low_level_execute_command(): starting 9455 1726776642.19622: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776642.0907795-9455-163647959214981/AnsiballZ_slurp.py && sleep 0' 9455 1726776642.34833: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 9455 1726776642.35842: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. 
<<< 9455 1726776642.35882: stderr chunk (state=3): >>><<< 9455 1726776642.35890: stdout chunk (state=3): >>><<< 9455 1726776642.35905: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.8.186 closed. 9455 1726776642.35925: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776642.0907795-9455-163647959214981/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9455 1726776642.35936: _low_level_execute_command(): starting 9455 1726776642.35940: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776642.0907795-9455-163647959214981/ > /dev/null 2>&1 && sleep 0' 9455 1726776642.38310: stderr chunk (state=2): >>><<< 9455 1726776642.38317: stdout chunk (state=2): >>><<< 9455 1726776642.38332: _low_level_execute_command() done: rc=0, stdout=, stderr= 9455 1726776642.38338: handler run complete 9455 1726776642.38351: attempt loop complete, returning result 9455 1726776642.38354: _execute() done 9455 1726776642.38357: dumping result to json 9455 1726776642.38361: done dumping result, returning 9455 1726776642.38368: done running TaskExecutor() for managed_node3/TASK: Get active_profile [120fa90a-8a95-c4e4-06a7-000000000156] 9455 1726776642.38374: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000156 9455 1726776642.38405: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000156 9455 1726776642.38408: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8283 1726776642.38577: no more pending results, returning what we have 8283 1726776642.38581: results queue empty 8283 1726776642.38581: checking for any_errors_fatal 8283 1726776642.38590: done checking for any_errors_fatal 8283 1726776642.38591: checking for max_fail_percentage 8283 1726776642.38592: done checking for max_fail_percentage 8283 1726776642.38593: checking to see if all hosts have failed and the running result is not ok 8283 1726776642.38593: done checking to see if all hosts have failed 8283 1726776642.38594: getting the remaining hosts for this loop 8283 1726776642.38595: done getting the remaining hosts for this loop 8283 1726776642.38600: getting the next task for host managed_node3 8283 1726776642.38607: done getting next task for host managed_node3 8283 1726776642.38609: ^ task is: TASK: Ensure kernel_settings is not in active_profile 8283 1726776642.38612: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=6, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776642.38615: getting variables 8283 1726776642.38617: in VariableManager get_vars() 8283 1726776642.38645: Calling all_inventory to load vars for managed_node3 8283 1726776642.38647: Calling groups_inventory to load vars for managed_node3 8283 1726776642.38648: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776642.38655: Calling all_plugins_play to load vars for managed_node3 8283 1726776642.38657: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776642.38658: Calling groups_plugins_play to load vars for managed_node3 8283 1726776642.38692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776642.38720: done with get_vars() 8283 1726776642.38726: done getting variables 8283 1726776642.38768: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure kernel_settings is not in active_profile] ************************* task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:46 Thursday 19 September 2024 16:10:42 -0400 (0:00:00.338) 0:00:26.092 **** 8283 1726776642.38788: entering _queue_task() for managed_node3/copy 8283 1726776642.38949: worker is 1 (out of 1 available) 8283 1726776642.38965: exiting _queue_task() for managed_node3/copy 8283 1726776642.38977: done queuing things up, now waiting for results queue to drain 8283 1726776642.38979: waiting for pending results... 
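Note: the slurp payload returned above is base64-encoded; "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK" decodes to "virtual-guest kernel_settings", i.e. the kernel_settings profile is still listed as active. The task queued here, "Ensure kernel_settings is not in active_profile" (cleanup.yml:46), rewrites that file without the kernel_settings entry; the file written further below is 14 bytes, consistent with "virtual-guest" plus a trailing newline. (The ansible.legacy.stat call that precedes the copy below is the copy action plugin checking the existing file before transferring new content.) A rough sketch of such a task; the variable names __cur_profile and __active_profile match the task vars shown in the worker output, but the filter expressions used to derive them are assumptions:

    - name: Ensure kernel_settings is not in active_profile
      copy:
        content: "{{ __active_profile }}\n"
        dest: "{{ __kernel_settings_tuned_active_profile }}"
        mode: "0600"
      vars:
        # how these are derived from the slurped content is assumed here
        __cur_profile: "{{ __kernel_settings_tuned_current_profile }}"
        __active_profile: "{{ __cur_profile | replace('kernel_settings', '') | trim }}"
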
9468 1726776642.39091: running TaskExecutor() for managed_node3/TASK: Ensure kernel_settings is not in active_profile 9468 1726776642.39187: in run() - task 120fa90a-8a95-c4e4-06a7-000000000157 9468 1726776642.39202: variable 'ansible_search_path' from source: unknown 9468 1726776642.39206: variable 'ansible_search_path' from source: unknown 9468 1726776642.39238: calling self._execute() 9468 1726776642.39290: variable 'ansible_host' from source: host vars for 'managed_node3' 9468 1726776642.39299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9468 1726776642.39308: variable 'omit' from source: magic vars 9468 1726776642.39382: variable 'omit' from source: magic vars 9468 1726776642.39412: variable 'omit' from source: magic vars 9468 1726776642.39434: variable '__active_profile' from source: task vars 9468 1726776642.39638: variable '__active_profile' from source: task vars 9468 1726776642.39775: variable '__cur_profile' from source: task vars 9468 1726776642.39877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9468 1726776642.41358: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9468 1726776642.41403: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9468 1726776642.41434: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9468 1726776642.41460: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9468 1726776642.41480: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9468 1726776642.41536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9468 1726776642.41568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9468 1726776642.41585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9468 1726776642.41610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9468 1726776642.41619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9468 1726776642.41689: variable '__kernel_settings_tuned_current_profile' from source: set_fact 9468 1726776642.41726: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars 9468 1726776642.41773: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars 9468 1726776642.41820: variable '__kernel_settings_tuned_dir' from source: role '' exported vars 9468 1726776642.41839: variable 'omit' from source: magic vars 9468 1726776642.41859: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9468 1726776642.41878: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9468 1726776642.41890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9468 1726776642.41902: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9468 1726776642.41909: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9468 1726776642.41932: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9468 1726776642.41936: variable 'ansible_host' from source: host vars for 'managed_node3' 9468 1726776642.41939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9468 1726776642.41999: Set connection var ansible_module_compression to ZIP_DEFLATED 9468 1726776642.42006: Set connection var ansible_shell_type to sh 9468 1726776642.42009: Set connection var ansible_timeout to 10 9468 1726776642.42012: Set connection var ansible_connection to ssh 9468 1726776642.42017: Set connection var ansible_pipelining to False 9468 1726776642.42020: Set connection var ansible_shell_executable to /bin/sh 9468 1726776642.42039: variable 'ansible_shell_executable' from source: unknown 9468 1726776642.42044: variable 'ansible_connection' from source: unknown 9468 1726776642.42047: variable 'ansible_module_compression' from source: unknown 9468 1726776642.42050: variable 'ansible_shell_type' from source: unknown 9468 1726776642.42053: variable 'ansible_shell_executable' from source: unknown 9468 1726776642.42056: variable 'ansible_host' from source: host vars for 'managed_node3' 9468 1726776642.42060: variable 'ansible_pipelining' from source: unknown 9468 1726776642.42063: variable 'ansible_timeout' from source: unknown 9468 1726776642.42067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9468 1726776642.42132: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9468 1726776642.42142: variable 'omit' from source: magic vars 9468 1726776642.42148: starting attempt loop 9468 1726776642.42151: running the handler 9468 1726776642.42161: _low_level_execute_command(): starting 9468 1726776642.42167: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9468 1726776642.44445: stdout chunk (state=2): >>>/root <<< 9468 1726776642.44564: stderr chunk (state=3): >>><<< 9468 1726776642.44572: stdout chunk (state=3): >>><<< 9468 1726776642.44591: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9468 1726776642.44604: _low_level_execute_command(): starting 9468 1726776642.44609: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084 `" && echo ansible-tmp-1726776642.4459972-9468-236712635974084="` echo /root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084 `" ) && sleep 0' 9468 1726776642.47806: stdout chunk (state=2): >>>ansible-tmp-1726776642.4459972-9468-236712635974084=/root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084 <<< 9468 1726776642.47931: stderr chunk (state=3): >>><<< 9468 1726776642.47938: stdout 
chunk (state=3): >>><<< 9468 1726776642.47952: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776642.4459972-9468-236712635974084=/root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084 , stderr= 9468 1726776642.48019: variable 'ansible_module_compression' from source: unknown 9468 1726776642.48058: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 9468 1726776642.48085: variable 'ansible_facts' from source: unknown 9468 1726776642.48159: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084/AnsiballZ_stat.py 9468 1726776642.48245: Sending initial data 9468 1726776642.48252: Sent initial data (151 bytes) 9468 1726776642.50699: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpbh41xn6w /root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084/AnsiballZ_stat.py <<< 9468 1726776642.51674: stderr chunk (state=3): >>><<< 9468 1726776642.51680: stdout chunk (state=3): >>><<< 9468 1726776642.51696: done transferring module to remote 9468 1726776642.51707: _low_level_execute_command(): starting 9468 1726776642.51712: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084/ /root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084/AnsiballZ_stat.py && sleep 0' 9468 1726776642.54046: stderr chunk (state=2): >>><<< 9468 1726776642.54052: stdout chunk (state=2): >>><<< 9468 1726776642.54063: _low_level_execute_command() done: rc=0, stdout=, stderr= 9468 1726776642.54067: _low_level_execute_command(): starting 9468 1726776642.54072: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084/AnsiballZ_stat.py && sleep 0' 9468 1726776642.70346: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 499122370, "dev": 51713, "nlink": 1, "atime": 1726776639.0549896, "mtime": 1726776631.5598924, "ctime": 1726776631.5598924, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "3787864203", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 9468 1726776642.71499: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. 
<<< 9468 1726776642.71546: stderr chunk (state=3): >>><<< 9468 1726776642.71553: stdout chunk (state=3): >>><<< 9468 1726776642.71569: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 499122370, "dev": 51713, "nlink": 1, "atime": 1726776639.0549896, "mtime": 1726776631.5598924, "ctime": 1726776631.5598924, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "3787864203", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.186 closed. 9468 1726776642.71616: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9468 1726776642.71698: Sending initial data 9468 1726776642.71705: Sent initial data (140 bytes) 9468 1726776642.74178: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmp8jopseoy /root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084/source <<< 9468 1726776642.74440: stderr chunk (state=3): >>><<< 9468 1726776642.74447: stdout chunk (state=3): >>><<< 9468 1726776642.74466: _low_level_execute_command(): starting 9468 1726776642.74473: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084/ /root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084/source && sleep 0' 9468 1726776642.76935: stderr chunk (state=2): >>><<< 9468 1726776642.76946: stdout chunk (state=2): >>><<< 9468 1726776642.76966: _low_level_execute_command() done: rc=0, stdout=, stderr= 9468 1726776642.76993: variable 'ansible_module_compression' from source: unknown 9468 1726776642.77044: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 9468 1726776642.77066: variable 'ansible_facts' from source: unknown 9468 1726776642.77155: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084/AnsiballZ_copy.py 9468 1726776642.77612: Sending initial data 9468 1726776642.77619: Sent initial data (151 bytes) 9468 1726776642.80836: stdout chunk (state=3): 
>>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpv3ya9hfc /root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084/AnsiballZ_copy.py <<< 9468 1726776642.81934: stderr chunk (state=3): >>><<< 9468 1726776642.81942: stdout chunk (state=3): >>><<< 9468 1726776642.81960: done transferring module to remote 9468 1726776642.81969: _low_level_execute_command(): starting 9468 1726776642.81974: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084/ /root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084/AnsiballZ_copy.py && sleep 0' 9468 1726776642.84515: stderr chunk (state=2): >>><<< 9468 1726776642.84523: stdout chunk (state=2): >>><<< 9468 1726776642.84540: _low_level_execute_command() done: rc=0, stdout=, stderr= 9468 1726776642.84545: _low_level_execute_command(): starting 9468 1726776642.84551: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084/AnsiballZ_copy.py && sleep 0' 9468 1726776643.01285: stdout chunk (state=2): >>> {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084/source", "md5sum": "9a561d913bcdb5a659ec2dd035975a8e", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 14, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084/source", "_original_basename": "tmp8jopseoy", "follow": false, "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 9468 1726776643.02455: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 9468 1726776643.02504: stderr chunk (state=3): >>><<< 9468 1726776643.02511: stdout chunk (state=3): >>><<< 9468 1726776643.02526: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084/source", "md5sum": "9a561d913bcdb5a659ec2dd035975a8e", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 14, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084/source", "_original_basename": "tmp8jopseoy", "follow": false, "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.8.186 closed. 
9468 1726776643.02553: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/active_profile', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084/source', '_original_basename': 'tmp8jopseoy', 'follow': False, 'checksum': '633f07e1b5698d04352d5dca735869bf2fe77897', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9468 1726776643.02564: _low_level_execute_command(): starting 9468 1726776643.02569: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084/ > /dev/null 2>&1 && sleep 0' 9468 1726776643.04955: stderr chunk (state=2): >>><<< 9468 1726776643.04966: stdout chunk (state=2): >>><<< 9468 1726776643.04980: _low_level_execute_command() done: rc=0, stdout=, stderr= 9468 1726776643.04988: handler run complete 9468 1726776643.05010: attempt loop complete, returning result 9468 1726776643.05015: _execute() done 9468 1726776643.05018: dumping result to json 9468 1726776643.05023: done dumping result, returning 9468 1726776643.05032: done running TaskExecutor() for managed_node3/TASK: Ensure kernel_settings is not in active_profile [120fa90a-8a95-c4e4-06a7-000000000157] 9468 1726776643.05038: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000157 9468 1726776643.05072: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000157 9468 1726776643.05076: WORKER PROCESS EXITING changed: [managed_node3] => { "changed": true, "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "md5sum": "9a561d913bcdb5a659ec2dd035975a8e", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 14, "src": "/root/.ansible/tmp/ansible-tmp-1726776642.4459972-9468-236712635974084/source", "state": "file", "uid": 0 } 8283 1726776643.05236: no more pending results, returning what we have 8283 1726776643.05239: results queue empty 8283 1726776643.05240: checking for any_errors_fatal 8283 1726776643.05246: done checking for any_errors_fatal 8283 1726776643.05246: checking for max_fail_percentage 8283 1726776643.05248: done checking for max_fail_percentage 8283 1726776643.05248: checking to see if all hosts have failed and the running result is not ok 8283 1726776643.05249: done checking to see if all hosts have failed 8283 1726776643.05249: getting the remaining hosts for this loop 8283 1726776643.05250: done getting the remaining hosts for this loop 8283 1726776643.05253: getting the next task for host managed_node3 8283 1726776643.05258: done getting next task for host managed_node3 8283 1726776643.05262: ^ task is: TASK: Set profile_mode to auto 8283 1726776643.05265: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=6, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776643.05268: getting variables 8283 1726776643.05269: in VariableManager get_vars() 8283 1726776643.05300: Calling all_inventory to load vars for managed_node3 8283 1726776643.05303: Calling groups_inventory to load vars for managed_node3 8283 1726776643.05305: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776643.05314: Calling all_plugins_play to load vars for managed_node3 8283 1726776643.05316: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776643.05318: Calling groups_plugins_play to load vars for managed_node3 8283 1726776643.05363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776643.05388: done with get_vars() 8283 1726776643.05395: done getting variables 8283 1726776643.05438: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set profile_mode to auto] ************************************************ task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:57 Thursday 19 September 2024 16:10:43 -0400 (0:00:00.666) 0:00:26.759 **** 8283 1726776643.05460: entering _queue_task() for managed_node3/copy 8283 1726776643.05624: worker is 1 (out of 1 available) 8283 1726776643.05641: exiting _queue_task() for managed_node3/copy 8283 1726776643.05653: done queuing things up, now waiting for results queue to drain 8283 1726776643.05654: waiting for pending results... 
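Note: the "Set profile_mode to auto" task queued here (cleanup.yml:57) writes /etc/tuned/profile_mode via the copy action; its result further down reports a 5-byte file, consistent with the content "auto" plus a newline. A minimal sketch, with the exact content string inferred from that result rather than taken from the playbook:

    - name: Set profile_mode to auto
      copy:
        content: "auto\n"
        dest: "{{ __kernel_settings_tuned_profile_mode }}"
        mode: "0600"
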
9498 1726776643.05773: running TaskExecutor() for managed_node3/TASK: Set profile_mode to auto 9498 1726776643.05875: in run() - task 120fa90a-8a95-c4e4-06a7-000000000158 9498 1726776643.05890: variable 'ansible_search_path' from source: unknown 9498 1726776643.05894: variable 'ansible_search_path' from source: unknown 9498 1726776643.05923: calling self._execute() 9498 1726776643.05975: variable 'ansible_host' from source: host vars for 'managed_node3' 9498 1726776643.05984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9498 1726776643.05992: variable 'omit' from source: magic vars 9498 1726776643.06072: variable 'omit' from source: magic vars 9498 1726776643.06106: variable 'omit' from source: magic vars 9498 1726776643.06127: variable '__kernel_settings_tuned_profile_mode' from source: role '' exported vars 9498 1726776643.06403: variable '__kernel_settings_tuned_profile_mode' from source: role '' exported vars 9498 1726776643.06463: variable '__kernel_settings_tuned_dir' from source: role '' exported vars 9498 1726776643.06493: variable 'omit' from source: magic vars 9498 1726776643.06530: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9498 1726776643.06558: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9498 1726776643.06576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9498 1726776643.06591: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9498 1726776643.06605: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9498 1726776643.06630: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9498 1726776643.06635: variable 'ansible_host' from source: host vars for 'managed_node3' 9498 1726776643.06640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9498 1726776643.06709: Set connection var ansible_module_compression to ZIP_DEFLATED 9498 1726776643.06717: Set connection var ansible_shell_type to sh 9498 1726776643.06724: Set connection var ansible_timeout to 10 9498 1726776643.06730: Set connection var ansible_connection to ssh 9498 1726776643.06737: Set connection var ansible_pipelining to False 9498 1726776643.06742: Set connection var ansible_shell_executable to /bin/sh 9498 1726776643.06757: variable 'ansible_shell_executable' from source: unknown 9498 1726776643.06760: variable 'ansible_connection' from source: unknown 9498 1726776643.06762: variable 'ansible_module_compression' from source: unknown 9498 1726776643.06763: variable 'ansible_shell_type' from source: unknown 9498 1726776643.06765: variable 'ansible_shell_executable' from source: unknown 9498 1726776643.06766: variable 'ansible_host' from source: host vars for 'managed_node3' 9498 1726776643.06768: variable 'ansible_pipelining' from source: unknown 9498 1726776643.06770: variable 'ansible_timeout' from source: unknown 9498 1726776643.06772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9498 1726776643.06879: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 9498 1726776643.06890: variable 'omit' from source: magic vars 9498 1726776643.06895: starting attempt loop 9498 1726776643.06901: running the handler 9498 1726776643.06910: _low_level_execute_command(): starting 9498 1726776643.06915: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9498 1726776643.09183: stdout chunk (state=2): >>>/root <<< 9498 1726776643.09303: stderr chunk (state=3): >>><<< 9498 1726776643.09309: stdout chunk (state=3): >>><<< 9498 1726776643.09327: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9498 1726776643.09340: _low_level_execute_command(): starting 9498 1726776643.09344: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798 `" && echo ansible-tmp-1726776643.0933545-9498-28517041140798="` echo /root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798 `" ) && sleep 0' 9498 1726776643.11752: stdout chunk (state=2): >>>ansible-tmp-1726776643.0933545-9498-28517041140798=/root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798 <<< 9498 1726776643.11878: stderr chunk (state=3): >>><<< 9498 1726776643.11884: stdout chunk (state=3): >>><<< 9498 1726776643.11896: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776643.0933545-9498-28517041140798=/root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798 , stderr= 9498 1726776643.11961: variable 'ansible_module_compression' from source: unknown 9498 1726776643.12000: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 9498 1726776643.12027: variable 'ansible_facts' from source: unknown 9498 1726776643.12098: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798/AnsiballZ_stat.py 9498 1726776643.12180: Sending initial data 9498 1726776643.12187: Sent initial data (150 bytes) 9498 1726776643.15246: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpzskwhwl_ /root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798/AnsiballZ_stat.py <<< 9498 1726776643.16276: stderr chunk (state=3): >>><<< 9498 1726776643.16284: stdout chunk (state=3): >>><<< 9498 1726776643.16303: done transferring module to remote 9498 1726776643.16314: _low_level_execute_command(): starting 9498 1726776643.16319: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798/ /root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798/AnsiballZ_stat.py && sleep 0' 9498 1726776643.18619: stderr chunk (state=2): >>><<< 9498 1726776643.18626: stdout chunk (state=2): >>><<< 9498 1726776643.18640: _low_level_execute_command() done: rc=0, stdout=, stderr= 9498 1726776643.18644: _low_level_execute_command(): starting 9498 1726776643.18650: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798/AnsiballZ_stat.py && sleep 0' 9498 1726776643.35093: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 505413763, "dev": 51713, "nlink": 1, "atime": 
1726776639.9680014, "mtime": 1726776631.5608926, "ctime": 1726776631.5608926, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "3997735162", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 9498 1726776643.36023: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 9498 1726776643.36067: stderr chunk (state=3): >>><<< 9498 1726776643.36075: stdout chunk (state=3): >>><<< 9498 1726776643.36092: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 505413763, "dev": 51713, "nlink": 1, "atime": 1726776639.9680014, "mtime": 1726776631.5608926, "ctime": 1726776631.5608926, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "3997735162", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.8.186 closed. 
9498 1726776643.36144: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9498 1726776643.36230: Sending initial data 9498 1726776643.36239: Sent initial data (139 bytes) 9498 1726776643.38715: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpo78fn48v /root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798/source <<< 9498 1726776643.38958: stderr chunk (state=3): >>><<< 9498 1726776643.38965: stdout chunk (state=3): >>><<< 9498 1726776643.38983: _low_level_execute_command(): starting 9498 1726776643.38991: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798/ /root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798/source && sleep 0' 9498 1726776643.41308: stderr chunk (state=2): >>><<< 9498 1726776643.41314: stdout chunk (state=2): >>><<< 9498 1726776643.41326: _low_level_execute_command() done: rc=0, stdout=, stderr= 9498 1726776643.41346: variable 'ansible_module_compression' from source: unknown 9498 1726776643.41383: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 9498 1726776643.41403: variable 'ansible_facts' from source: unknown 9498 1726776643.41459: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798/AnsiballZ_copy.py 9498 1726776643.41539: Sending initial data 9498 1726776643.41546: Sent initial data (150 bytes) 9498 1726776643.43964: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmpiez4wqig /root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798/AnsiballZ_copy.py <<< 9498 1726776643.44953: stderr chunk (state=3): >>><<< 9498 1726776643.44960: stdout chunk (state=3): >>><<< 9498 1726776643.44975: done transferring module to remote 9498 1726776643.44982: _low_level_execute_command(): starting 9498 1726776643.44987: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798/ /root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798/AnsiballZ_copy.py && sleep 0' 9498 1726776643.47680: stderr chunk (state=2): >>><<< 9498 1726776643.47689: stdout chunk (state=2): >>><<< 9498 1726776643.47703: _low_level_execute_command() done: rc=0, stdout=, stderr= 9498 1726776643.47708: _low_level_execute_command(): starting 9498 1726776643.47713: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798/AnsiballZ_copy.py && sleep 0' 9498 1726776643.64365: stdout chunk (state=2): >>> {"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798/source", 
"md5sum": "451e20aff0f489cd2f7d4d73533aa961", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 5, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798/source", "_original_basename": "tmpo78fn48v", "follow": false, "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 9498 1726776643.65535: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 9498 1726776643.65579: stderr chunk (state=3): >>><<< 9498 1726776643.65585: stdout chunk (state=3): >>><<< 9498 1726776643.65600: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798/source", "md5sum": "451e20aff0f489cd2f7d4d73533aa961", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 5, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798/source", "_original_basename": "tmpo78fn48v", "follow": false, "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.8.186 closed. 
9498 1726776643.65628: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/profile_mode', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798/source', '_original_basename': 'tmpo78fn48v', 'follow': False, 'checksum': '43683f4e92c48be4b00ddd86e011a4f27fcdbeb5', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9498 1726776643.65640: _low_level_execute_command(): starting 9498 1726776643.65647: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798/ > /dev/null 2>&1 && sleep 0' 9498 1726776643.67998: stderr chunk (state=2): >>><<< 9498 1726776643.68009: stdout chunk (state=2): >>><<< 9498 1726776643.68021: _low_level_execute_command() done: rc=0, stdout=, stderr= 9498 1726776643.68026: handler run complete 9498 1726776643.68047: attempt loop complete, returning result 9498 1726776643.68053: _execute() done 9498 1726776643.68056: dumping result to json 9498 1726776643.68060: done dumping result, returning 9498 1726776643.68065: done running TaskExecutor() for managed_node3/TASK: Set profile_mode to auto [120fa90a-8a95-c4e4-06a7-000000000158] 9498 1726776643.68071: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000158 9498 1726776643.68099: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000158 9498 1726776643.68103: WORKER PROCESS EXITING changed: [managed_node3] => { "changed": true, "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "md5sum": "451e20aff0f489cd2f7d4d73533aa961", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 5, "src": "/root/.ansible/tmp/ansible-tmp-1726776643.0933545-9498-28517041140798/source", "state": "file", "uid": 0 } 8283 1726776643.68380: no more pending results, returning what we have 8283 1726776643.68383: results queue empty 8283 1726776643.68384: checking for any_errors_fatal 8283 1726776643.68389: done checking for any_errors_fatal 8283 1726776643.68389: checking for max_fail_percentage 8283 1726776643.68390: done checking for max_fail_percentage 8283 1726776643.68391: checking to see if all hosts have failed and the running result is not ok 8283 1726776643.68391: done checking to see if all hosts have failed 8283 1726776643.68391: getting the remaining hosts for this loop 8283 1726776643.68392: done getting the remaining hosts for this loop 8283 1726776643.68394: getting the next task for host managed_node3 8283 1726776643.68399: done getting next task for host managed_node3 8283 1726776643.68401: ^ task is: TASK: Restart tuned 8283 1726776643.68403: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=6, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8283 1726776643.68405: getting variables 8283 1726776643.68406: in VariableManager get_vars() 8283 1726776643.68433: Calling all_inventory to load vars for managed_node3 8283 1726776643.68435: Calling groups_inventory to load vars for managed_node3 8283 1726776643.68437: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776643.68444: Calling all_plugins_play to load vars for managed_node3 8283 1726776643.68446: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776643.68448: Calling groups_plugins_play to load vars for managed_node3 8283 1726776643.68482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776643.68509: done with get_vars() 8283 1726776643.68515: done getting variables 8283 1726776643.68560: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Restart tuned] *********************************************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:64 Thursday 19 September 2024 16:10:43 -0400 (0:00:00.631) 0:00:27.390 **** 8283 1726776643.68581: entering _queue_task() for managed_node3/service 8283 1726776643.68735: worker is 1 (out of 1 available) 8283 1726776643.68749: exiting _queue_task() for managed_node3/service 8283 1726776643.68760: done queuing things up, now waiting for results queue to drain 8283 1726776643.68762: waiting for pending results... 
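Note: the "Restart tuned" task queued here (cleanup.yml:64) uses the generic service module and iterates over __kernel_settings_services from include_vars (the per-item 'item' variable appears in the worker output below). The AnsiballZ_setup.py run that follows is the service action plugin gathering only ansible_service_mgr so it can pick the right backend; it reports systemd, so the restart itself would be dispatched to the systemd module. A plausible sketch, with the loop form and state being assumptions consistent with the task name:

    - name: Restart tuned
      service:
        name: "{{ item }}"
        state: restarted
      loop: "{{ __kernel_settings_services }}"
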
9536 1726776643.68876: running TaskExecutor() for managed_node3/TASK: Restart tuned 9536 1726776643.68974: in run() - task 120fa90a-8a95-c4e4-06a7-000000000159 9536 1726776643.68990: variable 'ansible_search_path' from source: unknown 9536 1726776643.68994: variable 'ansible_search_path' from source: unknown 9536 1726776643.69030: variable '__kernel_settings_services' from source: include_vars 9536 1726776643.69322: variable '__kernel_settings_services' from source: include_vars 9536 1726776643.69378: variable 'omit' from source: magic vars 9536 1726776643.69446: variable 'ansible_host' from source: host vars for 'managed_node3' 9536 1726776643.69456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9536 1726776643.69464: variable 'omit' from source: magic vars 9536 1726776643.69516: variable 'omit' from source: magic vars 9536 1726776643.69543: variable 'omit' from source: magic vars 9536 1726776643.69569: variable 'item' from source: unknown 9536 1726776643.69621: variable 'item' from source: unknown 9536 1726776643.69644: variable 'omit' from source: magic vars 9536 1726776643.69676: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9536 1726776643.69706: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9536 1726776643.69724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9536 1726776643.69739: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9536 1726776643.69751: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9536 1726776643.69778: variable 'inventory_hostname' from source: host vars for 'managed_node3' 9536 1726776643.69787: variable 'ansible_host' from source: host vars for 'managed_node3' 9536 1726776643.69793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9536 1726776643.69873: Set connection var ansible_module_compression to ZIP_DEFLATED 9536 1726776643.69880: Set connection var ansible_shell_type to sh 9536 1726776643.69884: Set connection var ansible_timeout to 10 9536 1726776643.69887: Set connection var ansible_connection to ssh 9536 1726776643.69891: Set connection var ansible_pipelining to False 9536 1726776643.69894: Set connection var ansible_shell_executable to /bin/sh 9536 1726776643.69908: variable 'ansible_shell_executable' from source: unknown 9536 1726776643.69911: variable 'ansible_connection' from source: unknown 9536 1726776643.69913: variable 'ansible_module_compression' from source: unknown 9536 1726776643.69915: variable 'ansible_shell_type' from source: unknown 9536 1726776643.69916: variable 'ansible_shell_executable' from source: unknown 9536 1726776643.69918: variable 'ansible_host' from source: host vars for 'managed_node3' 9536 1726776643.69920: variable 'ansible_pipelining' from source: unknown 9536 1726776643.69922: variable 'ansible_timeout' from source: unknown 9536 1726776643.69924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 9536 1726776643.70016: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 9536 1726776643.70026: variable 'omit' from source: magic vars 9536 1726776643.70033: starting attempt loop 9536 1726776643.70037: running the handler 9536 1726776643.70095: variable 'ansible_facts' from source: unknown 9536 1726776643.70126: _low_level_execute_command(): starting 9536 1726776643.70135: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9536 1726776643.72532: stdout chunk (state=2): >>>/root <<< 9536 1726776643.72650: stderr chunk (state=3): >>><<< 9536 1726776643.72657: stdout chunk (state=3): >>><<< 9536 1726776643.72675: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9536 1726776643.72688: _low_level_execute_command(): starting 9536 1726776643.72693: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776643.7268267-9536-19792094110261 `" && echo ansible-tmp-1726776643.7268267-9536-19792094110261="` echo /root/.ansible/tmp/ansible-tmp-1726776643.7268267-9536-19792094110261 `" ) && sleep 0' 9536 1726776643.75073: stdout chunk (state=2): >>>ansible-tmp-1726776643.7268267-9536-19792094110261=/root/.ansible/tmp/ansible-tmp-1726776643.7268267-9536-19792094110261 <<< 9536 1726776643.75195: stderr chunk (state=3): >>><<< 9536 1726776643.75205: stdout chunk (state=3): >>><<< 9536 1726776643.75221: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776643.7268267-9536-19792094110261=/root/.ansible/tmp/ansible-tmp-1726776643.7268267-9536-19792094110261 , stderr= 9536 1726776643.75245: variable 'ansible_module_compression' from source: unknown 9536 1726776643.75282: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 9536 1726776643.75332: variable 'ansible_facts' from source: unknown 9536 1726776643.75485: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776643.7268267-9536-19792094110261/AnsiballZ_setup.py 9536 1726776643.75590: Sending initial data 9536 1726776643.75597: Sent initial data (151 bytes) 9536 1726776643.77992: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmp26dfixi7 /root/.ansible/tmp/ansible-tmp-1726776643.7268267-9536-19792094110261/AnsiballZ_setup.py <<< 9536 1726776643.79788: stderr chunk (state=3): >>><<< 9536 1726776643.79795: stdout chunk (state=3): >>><<< 9536 1726776643.79815: done transferring module to remote 9536 1726776643.79825: _low_level_execute_command(): starting 9536 1726776643.79832: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776643.7268267-9536-19792094110261/ /root/.ansible/tmp/ansible-tmp-1726776643.7268267-9536-19792094110261/AnsiballZ_setup.py && sleep 0' 9536 1726776643.82125: stderr chunk (state=2): >>><<< 9536 1726776643.82133: stdout chunk (state=2): >>><<< 9536 1726776643.82146: _low_level_execute_command() done: rc=0, stdout=, stderr= 9536 1726776643.82150: _low_level_execute_command(): starting 9536 1726776643.82155: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776643.7268267-9536-19792094110261/AnsiballZ_setup.py && sleep 0' 9536 1726776644.09646: stdout chunk (state=2): >>> {"ansible_facts": {"ansible_service_mgr": "systemd"}, "invocation": {"module_args": {"gather_subset": ["!all"], "filter": ["ansible_service_mgr"], "gather_timeout": 10, "fact_path": "/etc/ansible/facts.d"}}} <<< 9536 1726776644.11280: 
stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 9536 1726776644.11331: stderr chunk (state=3): >>><<< 9536 1726776644.11338: stdout chunk (state=3): >>><<< 9536 1726776644.11353: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_service_mgr": "systemd"}, "invocation": {"module_args": {"gather_subset": ["!all"], "filter": ["ansible_service_mgr"], "gather_timeout": 10, "fact_path": "/etc/ansible/facts.d"}}} , stderr=Shared connection to 10.31.8.186 closed. 9536 1726776644.11383: done with _execute_module (ansible.legacy.setup, {'gather_subset': '!all', 'filter': 'ansible_service_mgr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776643.7268267-9536-19792094110261/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9536 1726776644.11403: Facts {'ansible_facts': {'ansible_service_mgr': 'systemd'}, 'invocation': {'module_args': {'gather_subset': ['!all'], 'filter': ['ansible_service_mgr'], 'gather_timeout': 10, 'fact_path': '/etc/ansible/facts.d'}}, '_ansible_parsed': True} 9536 1726776644.11459: variable 'ansible_module_compression' from source: unknown 9536 1726776644.11494: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8283flof6zsc/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 9536 1726776644.11542: variable 'ansible_facts' from source: unknown 9536 1726776644.11697: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776643.7268267-9536-19792094110261/AnsiballZ_systemd.py 9536 1726776644.11796: Sending initial data 9536 1726776644.11806: Sent initial data (153 bytes) 9536 1726776644.14316: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8283flof6zsc/tmp57l3ojuu /root/.ansible/tmp/ansible-tmp-1726776643.7268267-9536-19792094110261/AnsiballZ_systemd.py <<< 9536 1726776644.16451: stderr chunk (state=3): >>><<< 9536 1726776644.16464: stdout chunk (state=3): >>><<< 9536 1726776644.16483: done transferring module to remote 9536 1726776644.16490: _low_level_execute_command(): starting 9536 1726776644.16493: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776643.7268267-9536-19792094110261/ /root/.ansible/tmp/ansible-tmp-1726776643.7268267-9536-19792094110261/AnsiballZ_systemd.py && sleep 0' 9536 1726776644.19090: stderr chunk (state=2): >>><<< 9536 1726776644.19100: stdout chunk (state=2): >>><<< 9536 1726776644.19117: _low_level_execute_command() done: rc=0, stdout=, stderr= 9536 1726776644.19122: _low_level_execute_command(): starting 9536 1726776644.19128: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776643.7268267-9536-19792094110261/AnsiballZ_systemd.py && sleep 0' 9536 1726776644.47015: stdout chunk (state=2): >>> {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", 
"WatchdogTimestamp": "Thu 2024-09-19 16:10:31 EDT", "WatchdogTimestampMonotonic": "239516555", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "9800", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:31 EDT", "ExecMainStartTimestampMonotonic": "239379769", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "9800", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:31 EDT] ; stop_time=[n/a] ; pid=9800 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "15011840", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryH<<< 9536 1726776644.47055: stdout chunk (state=3): >>>igh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", 
"CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "sysinit.target system.slice dbus.socket dbus.service", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target cpupower.service", "Before": "multi-user.target shutdown.target", "After": "system.slice systemd-journald.socket dbus.service network.target sysinit.target basic.target dbus.socket systemd-sysctl.service polkit.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:10:31 EDT", "Stat<<< 9536 1726776644.47070: stdout chunk (state=3): >>>eChangeTimestampMonotonic": "239516558", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:31 EDT", "InactiveExitTimestampMonotonic": "239379946", "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:31 EDT", "ActiveEnterTimestampMonotonic": "239516558", "ActiveExitTimestamp": "Thu 2024-09-19 16:10:31 EDT", "ActiveExitTimestampMonotonic": "239266126", "InactiveEnterTimestamp": "Thu 2024-09-19 16:10:31 EDT", "InactiveEnterTimestampMonotonic": "239376765", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:31 EDT", "ConditionTimestampMonotonic": "239377966", "AssertTimestamp": "Thu 2024-09-19 16:10:31 EDT", "AssertTimestampMonotonic": "239377967", "Transient": "no", "Perpetual": "no", 
"StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "d22f365eaba74b3a87958cacc5d42cbe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 9536 1726776644.48867: stderr chunk (state=3): >>>Shared connection to 10.31.8.186 closed. <<< 9536 1726776644.48877: stdout chunk (state=3): >>><<< 9536 1726776644.48887: stderr chunk (state=3): >>><<< 9536 1726776644.48910: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:10:31 EDT", "WatchdogTimestampMonotonic": "239516555", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "9800", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:31 EDT", "ExecMainStartTimestampMonotonic": "239379769", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "9800", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:31 EDT] ; stop_time=[n/a] ; pid=9800 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "15011840", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", 
"LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "sysinit.target system.slice dbus.socket dbus.service", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target cpupower.service", "Before": "multi-user.target shutdown.target", "After": "system.slice systemd-journald.socket dbus.service network.target sysinit.target basic.target dbus.socket systemd-sysctl.service polkit.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:10:31 EDT", "StateChangeTimestampMonotonic": "239516558", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:31 EDT", "InactiveExitTimestampMonotonic": "239379946", "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:31 EDT", 
"ActiveEnterTimestampMonotonic": "239516558", "ActiveExitTimestamp": "Thu 2024-09-19 16:10:31 EDT", "ActiveExitTimestampMonotonic": "239266126", "InactiveEnterTimestamp": "Thu 2024-09-19 16:10:31 EDT", "InactiveEnterTimestampMonotonic": "239376765", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:31 EDT", "ConditionTimestampMonotonic": "239377966", "AssertTimestamp": "Thu 2024-09-19 16:10:31 EDT", "AssertTimestampMonotonic": "239377967", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "d22f365eaba74b3a87958cacc5d42cbe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.8.186 closed. 9536 1726776644.49099: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776643.7268267-9536-19792094110261/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9536 1726776644.49123: _low_level_execute_command(): starting 9536 1726776644.49131: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776643.7268267-9536-19792094110261/ > /dev/null 2>&1 && sleep 0' 9536 1726776644.52484: stderr chunk (state=2): >>><<< 9536 1726776644.52492: stdout chunk (state=2): >>><<< 9536 1726776644.52507: _low_level_execute_command() done: rc=0, stdout=, stderr= 9536 1726776644.52514: handler run complete 9536 1726776644.52547: attempt loop complete, returning result 9536 1726776644.52564: variable 'item' from source: unknown 9536 1726776644.52636: variable 'item' from source: unknown ok: [managed_node3] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:31 EDT", "ActiveEnterTimestampMonotonic": "239516558", "ActiveExitTimestamp": "Thu 2024-09-19 16:10:31 EDT", "ActiveExitTimestampMonotonic": "239266126", "ActiveState": "active", "After": "system.slice systemd-journald.socket dbus.service network.target sysinit.target basic.target dbus.socket systemd-sysctl.service polkit.service", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 16:10:31 
EDT", "AssertTimestampMonotonic": "239377967", "Before": "multi-user.target shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:31 EDT", "ConditionTimestampMonotonic": "239377966", "ConfigurationDirectoryMode": "0755", "Conflicts": "auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target cpupower.service", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "9800", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:31 EDT", "ExecMainStartTimestampMonotonic": "239379769", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:31 EDT] ; stop_time=[n/a] ; pid=9800 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestamp": "Thu 2024-09-19 16:10:31 EDT", "InactiveEnterTimestampMonotonic": "239376765", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:31 EDT", "InactiveExitTimestampMonotonic": "239379946", "InvocationID": "d22f365eaba74b3a87958cacc5d42cbe", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", 
"LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "9800", "MemoryAccounting": "yes", "MemoryCurrent": "15011840", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "sysinit.target system.slice dbus.socket dbus.service", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 16:10:31 EDT", "StateChangeTimestampMonotonic": "239516558", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 16:10:31 EDT", "WatchdogTimestampMonotonic": "239516555", "WatchdogUSec": "0" } } 9536 1726776644.52754: dumping result to json 9536 1726776644.52775: done dumping result, returning 9536 1726776644.52782: done running TaskExecutor() for managed_node3/TASK: Restart tuned 
[120fa90a-8a95-c4e4-06a7-000000000159] 9536 1726776644.52788: sending task result for task 120fa90a-8a95-c4e4-06a7-000000000159 9536 1726776644.52897: done sending task result for task 120fa90a-8a95-c4e4-06a7-000000000159 9536 1726776644.52901: WORKER PROCESS EXITING 8283 1726776644.53265: no more pending results, returning what we have 8283 1726776644.53267: results queue empty 8283 1726776644.53267: checking for any_errors_fatal 8283 1726776644.53271: done checking for any_errors_fatal 8283 1726776644.53271: checking for max_fail_percentage 8283 1726776644.53272: done checking for max_fail_percentage 8283 1726776644.53272: checking to see if all hosts have failed and the running result is not ok 8283 1726776644.53273: done checking to see if all hosts have failed 8283 1726776644.53273: getting the remaining hosts for this loop 8283 1726776644.53274: done getting the remaining hosts for this loop 8283 1726776644.53277: getting the next task for host managed_node3 8283 1726776644.53282: done getting next task for host managed_node3 8283 1726776644.53283: ^ task is: TASK: meta (flush_handlers) 8283 1726776644.53284: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776644.53288: getting variables 8283 1726776644.53288: in VariableManager get_vars() 8283 1726776644.53309: Calling all_inventory to load vars for managed_node3 8283 1726776644.53311: Calling groups_inventory to load vars for managed_node3 8283 1726776644.53312: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776644.53319: Calling all_plugins_play to load vars for managed_node3 8283 1726776644.53320: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776644.53322: Calling groups_plugins_play to load vars for managed_node3 8283 1726776644.53364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776644.53394: done with get_vars() 8283 1726776644.53400: done getting variables 8283 1726776644.53460: in VariableManager get_vars() 8283 1726776644.53469: Calling all_inventory to load vars for managed_node3 8283 1726776644.53471: Calling groups_inventory to load vars for managed_node3 8283 1726776644.53472: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776644.53475: Calling all_plugins_play to load vars for managed_node3 8283 1726776644.53476: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776644.53478: Calling groups_plugins_play to load vars for managed_node3 8283 1726776644.53510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776644.53527: done with get_vars() 8283 1726776644.53538: done queuing things up, now waiting for results queue to drain 8283 1726776644.53539: results queue empty 8283 1726776644.53539: checking for any_errors_fatal 8283 1726776644.53543: done checking for any_errors_fatal 8283 1726776644.53544: checking for max_fail_percentage 8283 1726776644.53544: done checking for max_fail_percentage 8283 1726776644.53545: checking to see if all hosts have failed and the running result is not ok 8283 1726776644.53545: done checking to see if all hosts have failed 8283 1726776644.53545: getting the remaining hosts for this loop 8283 1726776644.53546: 
done getting the remaining hosts for this loop 8283 1726776644.53547: getting the next task for host managed_node3 8283 1726776644.53550: done getting next task for host managed_node3 8283 1726776644.53551: ^ task is: TASK: meta (flush_handlers) 8283 1726776644.53552: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8283 1726776644.53554: getting variables 8283 1726776644.53554: in VariableManager get_vars() 8283 1726776644.53560: Calling all_inventory to load vars for managed_node3 8283 1726776644.53562: Calling groups_inventory to load vars for managed_node3 8283 1726776644.53563: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776644.53566: Calling all_plugins_play to load vars for managed_node3 8283 1726776644.53567: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776644.53568: Calling groups_plugins_play to load vars for managed_node3 8283 1726776644.53587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776644.53604: done with get_vars() 8283 1726776644.53609: done getting variables 8283 1726776644.53638: in VariableManager get_vars() 8283 1726776644.53646: Calling all_inventory to load vars for managed_node3 8283 1726776644.53647: Calling groups_inventory to load vars for managed_node3 8283 1726776644.53648: Calling all_plugins_inventory to load vars for managed_node3 8283 1726776644.53651: Calling all_plugins_play to load vars for managed_node3 8283 1726776644.53652: Calling groups_plugins_inventory to load vars for managed_node3 8283 1726776644.53653: Calling groups_plugins_play to load vars for managed_node3 8283 1726776644.53672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8283 1726776644.53687: done with get_vars() 8283 1726776644.53693: done queuing things up, now waiting for results queue to drain 8283 1726776644.53694: results queue empty 8283 1726776644.53694: checking for any_errors_fatal 8283 1726776644.53696: done checking for any_errors_fatal 8283 1726776644.53696: checking for max_fail_percentage 8283 1726776644.53697: done checking for max_fail_percentage 8283 1726776644.53697: checking to see if all hosts have failed and the running result is not ok 8283 1726776644.53697: done checking to see if all hosts have failed 8283 1726776644.53698: getting the remaining hosts for this loop 8283 1726776644.53698: done getting the remaining hosts for this loop 8283 1726776644.53699: getting the next task for host managed_node3 8283 1726776644.53703: done getting next task for host managed_node3 8283 1726776644.53704: ^ task is: None 8283 1726776644.53705: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8283 1726776644.53705: done queuing things up, now waiting for results queue to drain 8283 1726776644.53706: results queue empty 8283 1726776644.53706: checking for any_errors_fatal 8283 1726776644.53706: done checking for any_errors_fatal 8283 1726776644.53707: checking for max_fail_percentage 8283 1726776644.53707: done checking for max_fail_percentage 8283 1726776644.53707: checking to see if all hosts have failed and the running result is not ok 8283 1726776644.53708: done checking to see if all hosts have failed 8283 1726776644.53709: getting the next task for host managed_node3 8283 1726776644.53710: done getting next task for host managed_node3 8283 1726776644.53711: ^ task is: None 8283 1726776644.53711: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node3 : ok=48 changed=8 unreachable=0 failed=0 skipped=19 rescued=0 ignored=0

Thursday 19 September 2024 16:10:44 -0400 (0:00:00.852) 0:00:28.242 ****
===============================================================================
fedora.linux_system_roles.kernel_settings : Ensure required packages are installed --- 6.39s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12
fedora.linux_system_roles.kernel_settings : Ensure required packages are installed --- 3.17s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12
fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes --- 1.14s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149
fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started --- 1.11s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67
fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role --- 0.93s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2
fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started --- 0.85s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67
fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile --- 0.85s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91
Restart tuned ----------------------------------------------------------- 0.85s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:64
fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory --- 0.82s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50
fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory --- 0.70s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50
fedora.linux_system_roles.kernel_settings : Apply kernel settings ------- 0.68s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112
Ensure kernel_settings is not in active_profile ------------------------- 0.67s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:46
fedora.linux_system_roles.kernel_settings : Set profile_mode to manual --- 0.66s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99
fedora.linux_system_roles.kernel_settings : Apply kernel settings ------- 0.66s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112
Set profile_mode to auto ------------------------------------------------ 0.63s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:57
fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly --- 0.58s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2
fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile --- 0.56s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91
fedora.linux_system_roles.kernel_settings : Set profile_mode to manual --- 0.56s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99
fedora.linux_system_roles.kernel_settings : Read tuned main config ------ 0.51s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42
fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists --- 0.46s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74
8283 1726776644.53807: RUNNING CLEANUP
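The PLAY RECAP block is standard ansible-playbook output, but the per-task timestamp lines during the run and the sorted duration table above are not; that layout matches a task-profiling callback such as ansible.posix.profile_tasks. Assuming that is the callback in use for this test run (an inference from the output format, not something stated in the log), it is typically enabled per invocation roughly like this:

  # Assumed example: enable per-task timing output for a single run.
  ANSIBLE_CALLBACKS_ENABLED=ansible.posix.profile_tasks ansible-playbook -vv <playbook>

The same effect can be had persistently with callbacks_enabled = ansible.posix.profile_tasks under [defaults] in ansible.cfg.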