[DEPRECATION WARNING]: The ANSIBLE_COLLECTIONS_PATHS option does not fit the
variable naming standard; use the singular form ANSIBLE_COLLECTIONS_PATH instead.
This feature will be removed from ansible-core in version 2.19. Deprecation
warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
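
As the run notes below, no config file was found, so the deprecated plural form was most likely set as an environment variable. A minimal fix, assuming the collection location reported for this run:

    # Switch from the deprecated plural variable to the singular form before invoking ansible-playbook.
    unset ANSIBLE_COLLECTIONS_PATHS
    export ANSIBLE_COLLECTIONS_PATH=/tmp/collections-EI7
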
  8303 1726773023.26034: starting run
ansible-playbook [core 2.16.11]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-EI7
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.1 (main, Feb 21 2024, 14:18:26) [GCC 8.5.0 20210514 (Red Hat 8.5.0-21)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
  8303 1726773023.26500: Added group all to inventory
  8303 1726773023.26502: Added group ungrouped to inventory
  8303 1726773023.26506: Group all now contains ungrouped
  8303 1726773023.26509: Examining possible inventory source: /tmp/kernel_settings-PVh/inventory.yml
  8303 1726773023.35597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
  8303 1726773023.35639: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
  8303 1726773023.35659: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
  8303 1726773023.35702: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
  8303 1726773023.35750: Loaded config def from plugin (inventory/script)
  8303 1726773023.35752: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
  8303 1726773023.35784: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
  8303 1726773023.35843: Loaded config def from plugin (inventory/yaml)
  8303 1726773023.35845: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
  8303 1726773023.35909: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
  8303 1726773023.36191: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
  8303 1726773023.36194: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
  8303 1726773023.36196: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
  8303 1726773023.36200: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
  8303 1726773023.36204: Loading data from /tmp/kernel_settings-PVh/inventory.yml
  8303 1726773023.36247: /tmp/kernel_settings-PVh/inventory.yml was not parsable by auto
  8303 1726773023.36293: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
  8303 1726773023.36323: Loading data from /tmp/kernel_settings-PVh/inventory.yml
  8303 1726773023.36379: group all already in inventory
  8303 1726773023.36384: set inventory_file for managed_node1
  8303 1726773023.36389: set inventory_dir for managed_node1
  8303 1726773023.36390: Added host managed_node1 to inventory
  8303 1726773023.36391: Added host managed_node1 to group all
  8303 1726773023.36392: set ansible_host for managed_node1
  8303 1726773023.36392: set ansible_ssh_extra_args for managed_node1
  8303 1726773023.36394: set inventory_file for managed_node2
  8303 1726773023.36396: set inventory_dir for managed_node2
  8303 1726773023.36396: Added host managed_node2 to inventory
  8303 1726773023.36397: Added host managed_node2 to group all
  8303 1726773023.36397: set ansible_host for managed_node2
  8303 1726773023.36398: set ansible_ssh_extra_args for managed_node2
  8303 1726773023.36399: set inventory_file for managed_node3
  8303 1726773023.36401: set inventory_dir for managed_node3
  8303 1726773023.36401: Added host managed_node3 to inventory
  8303 1726773023.36401: Added host managed_node3 to group all
  8303 1726773023.36402: set ansible_host for managed_node3
  8303 1726773023.36403: set ansible_ssh_extra_args for managed_node3
  8303 1726773023.36404: Reconcile groups and hosts in inventory.
  8303 1726773023.36407: Group ungrouped now contains managed_node1
  8303 1726773023.36408: Group ungrouped now contains managed_node2
  8303 1726773023.36408: Group ungrouped now contains managed_node3
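
An illustrative inventory of the shape this parse implies: three hosts that land in group all (and therefore ungrouped), each with ansible_host and ansible_ssh_extra_args set. Host names come from the log; the addresses and SSH options are placeholders, except the managed_node3 address, which is the one the play connects to later in this log:

    all:
      hosts:
        managed_node1:
          ansible_host: 203.0.113.11                            # placeholder address
          ansible_ssh_extra_args: -o StrictHostKeyChecking=no   # placeholder option
        managed_node2:
          ansible_host: 203.0.113.12                            # placeholder address
          ansible_ssh_extra_args: -o StrictHostKeyChecking=no   # placeholder option
        managed_node3:
          ansible_host: 10.31.47.99                             # address seen in the SSH step below
          ansible_ssh_extra_args: -o StrictHostKeyChecking=no   # placeholder option
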
  8303 1726773023.36464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
  8303 1726773023.36546: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
  8303 1726773023.36578: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
  8303 1726773023.36598: Loaded config def from plugin (vars/host_group_vars)
  8303 1726773023.36600: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
  8303 1726773023.36605: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
  8303 1726773023.36610: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
  8303 1726773023.36636: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
  8303 1726773023.36874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773023.36940: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
  8303 1726773023.36965: Loaded config def from plugin (connection/local)
  8303 1726773023.36968: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
  8303 1726773023.37305: Loaded config def from plugin (connection/paramiko_ssh)
  8303 1726773023.37308: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
  8303 1726773023.37912: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
  8303 1726773023.37938: Loaded config def from plugin (connection/psrp)
  8303 1726773023.37940: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
  8303 1726773023.38370: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
  8303 1726773023.38396: Loaded config def from plugin (connection/ssh)
  8303 1726773023.38399: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
  8303 1726773023.39586: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
  8303 1726773023.39612: Loaded config def from plugin (connection/winrm)
  8303 1726773023.39614: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
  8303 1726773023.39636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
  8303 1726773023.39680: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
  8303 1726773023.39722: Loaded config def from plugin (shell/cmd)
  8303 1726773023.39723: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
  8303 1726773023.39740: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
  8303 1726773023.39779: Loaded config def from plugin (shell/powershell)
  8303 1726773023.39780: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
  8303 1726773023.39822: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
  8303 1726773023.39929: Loaded config def from plugin (shell/sh)
  8303 1726773023.39930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
  8303 1726773023.39953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
  8303 1726773023.40031: Loaded config def from plugin (become/runas)
  8303 1726773023.40032: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
  8303 1726773023.40145: Loaded config def from plugin (become/su)
  8303 1726773023.40146: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
  8303 1726773023.40245: Loaded config def from plugin (become/sudo)
  8303 1726773023.40246: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
  8303 1726773023.40271: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_default.yml
  8303 1726773023.40521: in VariableManager get_vars()
  8303 1726773023.40536: done with get_vars()
  8303 1726773023.40570: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
  8303 1726773023.40579: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
  8303 1726773023.40864: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
  8303 1726773023.40951: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
  8303 1726773023.40953: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-EI7/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
  8303 1726773023.40977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
  8303 1726773023.40995: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
  8303 1726773023.41096: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
  8303 1726773023.41133: Loaded config def from plugin (callback/default)
  8303 1726773023.41135: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
  8303 1726773023.41958: Loaded config def from plugin (callback/junit)
  8303 1726773023.41960: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
  8303 1726773023.41996: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
  8303 1726773023.42034: Loaded config def from plugin (callback/minimal)
  8303 1726773023.42035: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
  8303 1726773023.42064: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
  8303 1726773023.42106: Loaded config def from plugin (callback/tree)
  8303 1726773023.42108: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
  8303 1726773023.42180: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
  8303 1726773023.42183: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-EI7/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
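
No config file was found, so the callback selection above most likely came from environment variables. A hedged reconstruction (the option names are real ansible-core settings; the values are inferred from the redirects and the skipped stdout callbacks, not read from this run's environment):

    # Requests the 'debug' stdout callback (redirected to ansible.posix.debug above) and
    # enables profile_tasks (redirected to ansible.posix.profile_tasks) for task timing.
    export ANSIBLE_STDOUT_CALLBACK=debug
    export ANSIBLE_CALLBACKS_ENABLED=profile_tasks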

PLAYBOOK: tests_default.yml ****************************************************
1 plays in /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_default.yml
  8303 1726773023.42205: in VariableManager get_vars()
  8303 1726773023.42216: done with get_vars()
  8303 1726773023.42220: in VariableManager get_vars()
  8303 1726773023.42225: done with get_vars()
  8303 1726773023.42227: variable 'omit' from source: magic vars
  8303 1726773023.42252: in VariableManager get_vars()
  8303 1726773023.42263: done with get_vars()
  8303 1726773023.42277: variable 'omit' from source: magic vars

PLAY [Ensure that the role runs with default parameters] ***********************
  8303 1726773023.44360: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
  8303 1726773023.44417: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
  8303 1726773023.44449: getting the remaining hosts for this loop
  8303 1726773023.44450: done getting the remaining hosts for this loop
  8303 1726773023.44453: getting the next task for host managed_node3
  8303 1726773023.44458: done getting next task for host managed_node3
  8303 1726773023.44459:  ^ task is: TASK: meta (flush_handlers)
  8303 1726773023.44460:  ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773023.44465: getting variables
  8303 1726773023.44466: in VariableManager get_vars()
  8303 1726773023.44477: Calling all_inventory to load vars for managed_node3
  8303 1726773023.44478: Calling groups_inventory to load vars for managed_node3
  8303 1726773023.44480: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773023.44491: Calling all_plugins_play to load vars for managed_node3
  8303 1726773023.44498: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773023.44500: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773023.44522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773023.44551: done with get_vars()
  8303 1726773023.44559: done getting variables
  8303 1726773023.44717: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
  8303 1726773023.44760: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
  8303 1726773023.44795: in VariableManager get_vars()
  8303 1726773023.44804: Calling all_inventory to load vars for managed_node3
  8303 1726773023.44805: Calling groups_inventory to load vars for managed_node3
  8303 1726773023.44807: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773023.44810: Calling all_plugins_play to load vars for managed_node3
  8303 1726773023.44811: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773023.44813: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773023.44831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773023.44839: done with get_vars()
  8303 1726773023.44846: done queuing things up, now waiting for results queue to drain
  8303 1726773023.44847: results queue empty
  8303 1726773023.44848: checking for any_errors_fatal
  8303 1726773023.44849: done checking for any_errors_fatal
  8303 1726773023.44850: checking for max_fail_percentage
  8303 1726773023.44850: done checking for max_fail_percentage
  8303 1726773023.44851: checking to see if all hosts have failed and the running result is not ok
  8303 1726773023.44851: done checking to see if all hosts have failed
  8303 1726773023.44851: getting the remaining hosts for this loop
  8303 1726773023.44852: done getting the remaining hosts for this loop
  8303 1726773023.44853: getting the next task for host managed_node3
  8303 1726773023.44856: done getting next task for host managed_node3
  8303 1726773023.44857:  ^ task is: TASK: Run role with no settings
  8303 1726773023.44858:  ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773023.44860: getting variables
  8303 1726773023.44860: in VariableManager get_vars()
  8303 1726773023.44865: Calling all_inventory to load vars for managed_node3
  8303 1726773023.44866: Calling groups_inventory to load vars for managed_node3
  8303 1726773023.44867: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773023.44869: Calling all_plugins_play to load vars for managed_node3
  8303 1726773023.44871: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773023.44872: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773023.44892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773023.44901: done with get_vars()
  8303 1726773023.44904: done getting variables

TASK [Run role with no settings] ***********************************************
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_default.yml:8
Thursday 19 September 2024  15:10:23 -0400 (0:00:00.028)       0:00:00.028 **** 
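
A hedged reconstruction of the play and of the task at tests_default.yml:8, using only names that appear in this log (the real file may differ in wording and host pattern):

    - name: Ensure that the role runs with default parameters
      hosts: all                      # assumed pattern; only managed_node3 receives tasks in this run
      tasks:
        - name: Run role with no settings
          include_role:
            name: fedora.linux_system_roles.kernel_settings
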
  8303 1726773023.44950: entering _queue_task() for managed_node3/include_role
  8303 1726773023.44951: Creating lock for include_role
  8303 1726773023.45157: worker is 1 (out of 1 available)
  8303 1726773023.45168: exiting _queue_task() for managed_node3/include_role
  8303 1726773023.45180: done queuing things up, now waiting for results queue to drain
  8303 1726773023.45181: waiting for pending results...
  8310 1726773023.45268: running TaskExecutor() for managed_node3/TASK: Run role with no settings
  8310 1726773023.45360: in run() - task 0affffe7-6841-6cfb-81ae-000000000006
  8310 1726773023.45375: variable 'ansible_search_path' from source: unknown
  8310 1726773023.45406: calling self._execute()
  8310 1726773023.45452: variable 'ansible_host' from source: host vars for 'managed_node3'
  8310 1726773023.45463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8310 1726773023.45472: variable 'omit' from source: magic vars
  8310 1726773023.45542: _execute() done
  8310 1726773023.45547: dumping result to json
  8310 1726773023.45552: done dumping result, returning
  8310 1726773023.45560: done running TaskExecutor() for managed_node3/TASK: Run role with no settings [0affffe7-6841-6cfb-81ae-000000000006]
  8310 1726773023.45567: sending task result for task 0affffe7-6841-6cfb-81ae-000000000006
  8310 1726773023.45594: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000006
  8310 1726773023.45598: WORKER PROCESS EXITING
  8303 1726773023.45707: no more pending results, returning what we have
  8303 1726773023.45711: in VariableManager get_vars()
  8303 1726773023.45734: Calling all_inventory to load vars for managed_node3
  8303 1726773023.45736: Calling groups_inventory to load vars for managed_node3
  8303 1726773023.45738: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773023.45746: Calling all_plugins_play to load vars for managed_node3
  8303 1726773023.45748: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773023.45751: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773023.45783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773023.45795: done with get_vars()
  8303 1726773023.45799: variable 'ansible_search_path' from source: unknown
  8303 1726773023.45846: variable 'omit' from source: magic vars
  8303 1726773023.45860: variable 'omit' from source: magic vars
  8303 1726773023.45869: variable 'omit' from source: magic vars
  8303 1726773023.45872: we have included files to process
  8303 1726773023.45872: generating all_blocks data
  8303 1726773023.45873: done generating all_blocks data
  8303 1726773023.45873: processing included file: fedora.linux_system_roles.kernel_settings
  8303 1726773023.45889: in VariableManager get_vars()
  8303 1726773023.45898: done with get_vars()
  8303 1726773023.45941: in VariableManager get_vars()
  8303 1726773023.45950: done with get_vars()
  8303 1726773023.45974: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml
  8303 1726773023.46059: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml
  8303 1726773023.46098: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml
  8303 1726773023.46188: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml
  8303 1726773023.48632: trying /usr/local/lib/python3.12/site-packages/ansible/modules
  8303 1726773023.48768: in VariableManager get_vars()
  8303 1726773023.48786: done with get_vars()
  8303 1726773023.49884: in VariableManager get_vars()
  8303 1726773023.49907: done with get_vars()
  8303 1726773023.50061: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml
  8303 1726773023.50719: iterating over new_blocks loaded from include file
  8303 1726773023.50721: in VariableManager get_vars()
  8303 1726773023.50739: done with get_vars()
  8303 1726773023.50741: filtering new block on tags
  8303 1726773023.50757: done filtering new block on tags
  8303 1726773023.50760: in VariableManager get_vars()
  8303 1726773023.50773: done with get_vars()
  8303 1726773023.50774: filtering new block on tags
  8303 1726773023.50793: done filtering new block on tags
  8303 1726773023.50795: in VariableManager get_vars()
  8303 1726773023.50828: done with get_vars()
  8303 1726773023.50830: filtering new block on tags
  8303 1726773023.50867: done filtering new block on tags
  8303 1726773023.50870: in VariableManager get_vars()
  8303 1726773023.50883: done with get_vars()
  8303 1726773023.50886: filtering new block on tags
  8303 1726773023.50902: done filtering new block on tags
  8303 1726773023.50904: done iterating over new_blocks loaded from include file
  8303 1726773023.50905: extending task lists for all hosts with included blocks
  8303 1726773023.50954: done extending task lists
  8303 1726773023.50955: done processing included files
  8303 1726773023.50955: results queue empty
  8303 1726773023.50956: checking for any_errors_fatal
  8303 1726773023.50958: done checking for any_errors_fatal
  8303 1726773023.50959: checking for max_fail_percentage
  8303 1726773023.50960: done checking for max_fail_percentage
  8303 1726773023.50960: checking to see if all hosts have failed and the running result is not ok
  8303 1726773023.50961: done checking to see if all hosts have failed
  8303 1726773023.50962: getting the remaining hosts for this loop
  8303 1726773023.50963: done getting the remaining hosts for this loop
  8303 1726773023.50965: getting the next task for host managed_node3
  8303 1726773023.50968: done getting next task for host managed_node3
  8303 1726773023.50970:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values
  8303 1726773023.50972:  ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773023.50981: getting variables
  8303 1726773023.50982: in VariableManager get_vars()
  8303 1726773023.50995: Calling all_inventory to load vars for managed_node3
  8303 1726773023.50998: Calling groups_inventory to load vars for managed_node3
  8303 1726773023.51000: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773023.51005: Calling all_plugins_play to load vars for managed_node3
  8303 1726773023.51007: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773023.51010: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773023.51038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773023.51059: done with get_vars()
  8303 1726773023.51065: done getting variables
  8303 1726773023.51125: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2
Thursday 19 September 2024  15:10:23 -0400 (0:00:00.062)       0:00:00.090 **** 
  8303 1726773023.51154: entering _queue_task() for managed_node3/fail
  8303 1726773023.51156: Creating lock for fail
  8303 1726773023.51375: worker is 1 (out of 1 available)
  8303 1726773023.51389: exiting _queue_task() for managed_node3/fail
  8303 1726773023.51400: done queuing things up, now waiting for results queue to drain
  8303 1726773023.51402: waiting for pending results...
  8312 1726773023.51582: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values
  8312 1726773023.51708: in run() - task 0affffe7-6841-6cfb-81ae-000000000023
  8312 1726773023.51725: variable 'ansible_search_path' from source: unknown
  8312 1726773023.51730: variable 'ansible_search_path' from source: unknown
  8312 1726773023.51762: calling self._execute()
  8312 1726773023.51821: variable 'ansible_host' from source: host vars for 'managed_node3'
  8312 1726773023.51830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8312 1726773023.51837: variable 'omit' from source: magic vars
  8312 1726773023.52255: variable 'kernel_settings_sysctl' from source: role '' defaults
  8312 1726773023.52266: variable '__kernel_settings_state_empty' from source: role '' all vars
  8312 1726773023.52278: Evaluated conditional (kernel_settings_sysctl != __kernel_settings_state_empty): True
  8312 1726773023.52612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
  8312 1726773023.54782: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
  8312 1726773023.54861: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
  8312 1726773023.54900: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
  8312 1726773023.54933: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
  8312 1726773023.54959: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
  8312 1726773023.55033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8312 1726773023.55062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8312 1726773023.55091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8312 1726773023.55132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8312 1726773023.55147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8312 1726773023.55200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8312 1726773023.55222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8312 1726773023.55247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8312 1726773023.55287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8312 1726773023.55302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8312 1726773023.55341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8312 1726773023.55364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8312 1726773023.55391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8312 1726773023.55430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8312 1726773023.55445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8312 1726773023.55720: variable 'kernel_settings_sysctl' from source: role '' defaults
  8312 1726773023.55747: Evaluated conditional ((kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", true) | list | length > 0) or (kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", false) | list | length > 0)): False
  8312 1726773023.55752: when evaluation is False, skipping this task
  8312 1726773023.55756: _execute() done
  8312 1726773023.55759: dumping result to json
  8312 1726773023.55763: done dumping result, returning
  8312 1726773023.55769: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [0affffe7-6841-6cfb-81ae-000000000023]
  8312 1726773023.55775: sending task result for task 0affffe7-6841-6cfb-81ae-000000000023
  8312 1726773023.55835: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000023
  8312 1726773023.55839: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "(kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", true) | list | length > 0) or (kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", false) | list | length > 0)",
    "skip_reason": "Conditional result was False"
}
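
A hedged sketch of this guard task at roles/kernel_settings/tasks/main.yml:2, reassembled from the action plugin (fail) and the two conditionals evaluated above; the failure message is a placeholder, since the log does not show it:

    - name: Check sysctl settings for boolean values
      fail:
        msg: kernel_settings_sysctl must not contain boolean values   # placeholder message
      when:
        - kernel_settings_sysctl != __kernel_settings_state_empty
        - >-
          (kernel_settings_sysctl | selectattr("value", "defined") |
          selectattr("value", "sameas", true) | list | length > 0) or
          (kernel_settings_sysctl | selectattr("value", "defined") |
          selectattr("value", "sameas", false) | list | length > 0)
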
  8303 1726773023.56084: no more pending results, returning what we have
  8303 1726773023.56089: results queue empty
  8303 1726773023.56090: checking for any_errors_fatal
  8303 1726773023.56091: done checking for any_errors_fatal
  8303 1726773023.56092: checking for max_fail_percentage
  8303 1726773023.56093: done checking for max_fail_percentage
  8303 1726773023.56094: checking to see if all hosts have failed and the running result is not ok
  8303 1726773023.56095: done checking to see if all hosts have failed
  8303 1726773023.56095: getting the remaining hosts for this loop
  8303 1726773023.56096: done getting the remaining hosts for this loop
  8303 1726773023.56099: getting the next task for host managed_node3
  8303 1726773023.56108: done getting next task for host managed_node3
  8303 1726773023.56110:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables
  8303 1726773023.56112:  ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773023.56125: getting variables
  8303 1726773023.56126: in VariableManager get_vars()
  8303 1726773023.56149: Calling all_inventory to load vars for managed_node3
  8303 1726773023.56151: Calling groups_inventory to load vars for managed_node3
  8303 1726773023.56152: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773023.56160: Calling all_plugins_play to load vars for managed_node3
  8303 1726773023.56162: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773023.56164: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773023.56195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773023.56216: done with get_vars()
  8303 1726773023.56222: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Set version specific variables] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:9
Thursday 19 September 2024  15:10:23 -0400 (0:00:00.051)       0:00:00.141 **** 
  8303 1726773023.56297: entering _queue_task() for managed_node3/include_tasks
  8303 1726773023.56298: Creating lock for include_tasks
  8303 1726773023.56469: worker is 1 (out of 1 available)
  8303 1726773023.56484: exiting _queue_task() for managed_node3/include_tasks
  8303 1726773023.56497: done queuing things up, now waiting for results queue to drain
  8303 1726773023.56498: waiting for pending results...
  8315 1726773023.56592: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables
  8315 1726773023.56689: in run() - task 0affffe7-6841-6cfb-81ae-000000000024
  8315 1726773023.56704: variable 'ansible_search_path' from source: unknown
  8315 1726773023.56708: variable 'ansible_search_path' from source: unknown
  8315 1726773023.56737: calling self._execute()
  8315 1726773023.56789: variable 'ansible_host' from source: host vars for 'managed_node3'
  8315 1726773023.56797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8315 1726773023.56806: variable 'omit' from source: magic vars
  8315 1726773023.56879: _execute() done
  8315 1726773023.56886: dumping result to json
  8315 1726773023.56891: done dumping result, returning
  8315 1726773023.56897: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables [0affffe7-6841-6cfb-81ae-000000000024]
  8315 1726773023.56904: sending task result for task 0affffe7-6841-6cfb-81ae-000000000024
  8315 1726773023.56926: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000024
  8315 1726773023.56930: WORKER PROCESS EXITING
  8303 1726773023.57028: no more pending results, returning what we have
  8303 1726773023.57032: in VariableManager get_vars()
  8303 1726773023.57065: Calling all_inventory to load vars for managed_node3
  8303 1726773023.57067: Calling groups_inventory to load vars for managed_node3
  8303 1726773023.57069: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773023.57076: Calling all_plugins_play to load vars for managed_node3
  8303 1726773023.57078: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773023.57080: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773023.57114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773023.57130: done with get_vars()
  8303 1726773023.57134: variable 'ansible_search_path' from source: unknown
  8303 1726773023.57134: variable 'ansible_search_path' from source: unknown
  8303 1726773023.57160: we have included files to process
  8303 1726773023.57161: generating all_blocks data
  8303 1726773023.57162: done generating all_blocks data
  8303 1726773023.57166: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml
  8303 1726773023.57167: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml
  8303 1726773023.57168: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml
included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml for managed_node3
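
The task at tasks/main.yml:9 resolves to a plain include; a hedged sketch built from the action name (include_tasks) and the file it loaded:

    # The role may reference the file with a longer path; the log shows it resolving to
    # roles/kernel_settings/tasks/set_vars.yml.
    - name: Set version specific variables
      include_tasks: set_vars.yml
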
  8303 1726773023.57732: done processing included file
  8303 1726773023.57733: iterating over new_blocks loaded from include file
  8303 1726773023.57734: in VariableManager get_vars()
  8303 1726773023.57747: done with get_vars()
  8303 1726773023.57748: filtering new block on tags
  8303 1726773023.57760: done filtering new block on tags
  8303 1726773023.57761: in VariableManager get_vars()
  8303 1726773023.57773: done with get_vars()
  8303 1726773023.57774: filtering new block on tags
  8303 1726773023.57786: done filtering new block on tags
  8303 1726773023.57788: in VariableManager get_vars()
  8303 1726773023.57800: done with get_vars()
  8303 1726773023.57802: filtering new block on tags
  8303 1726773023.57812: done filtering new block on tags
  8303 1726773023.57814: in VariableManager get_vars()
  8303 1726773023.57825: done with get_vars()
  8303 1726773023.57826: filtering new block on tags
  8303 1726773023.57834: done filtering new block on tags
  8303 1726773023.57836: done iterating over new_blocks loaded from include file
  8303 1726773023.57837: extending task lists for all hosts with included blocks
  8303 1726773023.57953: done extending task lists
  8303 1726773023.57954: done processing included files
  8303 1726773023.57955: results queue empty
  8303 1726773023.57955: checking for any_errors_fatal
  8303 1726773023.57957: done checking for any_errors_fatal
  8303 1726773023.57958: checking for max_fail_percentage
  8303 1726773023.57958: done checking for max_fail_percentage
  8303 1726773023.57959: checking to see if all hosts have failed and the running result is not ok
  8303 1726773023.57959: done checking to see if all hosts have failed
  8303 1726773023.57959: getting the remaining hosts for this loop
  8303 1726773023.57960: done getting the remaining hosts for this loop
  8303 1726773023.57961: getting the next task for host managed_node3
  8303 1726773023.57964: done getting next task for host managed_node3
  8303 1726773023.57966:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role
  8303 1726773023.57968:  ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773023.57973: getting variables
  8303 1726773023.57974: in VariableManager get_vars()
  8303 1726773023.57982: Calling all_inventory to load vars for managed_node3
  8303 1726773023.57983: Calling groups_inventory to load vars for managed_node3
  8303 1726773023.57986: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773023.57991: Calling all_plugins_play to load vars for managed_node3
  8303 1726773023.57992: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773023.57994: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773023.58013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773023.58029: done with get_vars()
  8303 1726773023.58033: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2
Thursday 19 September 2024  15:10:23 -0400 (0:00:00.017)       0:00:00.159 **** 
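
A hedged sketch of this fact-gathering task at set_vars.yml:2, built from the action (setup), the required-facts conditional evaluated below, and the subsets variable it expands; the exact parameters in the real file are an assumption:

    - name: Ensure ansible_facts used by role
      setup:
        gather_subset: "{{ __kernel_settings_required_facts_subsets }}"   # assumed parameter
      when: __kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0
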
  8303 1726773023.58078: entering _queue_task() for managed_node3/setup
  8303 1726773023.58243: worker is 1 (out of 1 available)
  8303 1726773023.58256: exiting _queue_task() for managed_node3/setup
  8303 1726773023.58267: done queuing things up, now waiting for results queue to drain
  8303 1726773023.58268: waiting for pending results...
  8316 1726773023.58447: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role
  8316 1726773023.58559: in run() - task 0affffe7-6841-6cfb-81ae-000000000085
  8316 1726773023.58574: variable 'ansible_search_path' from source: unknown
  8316 1726773023.58577: variable 'ansible_search_path' from source: unknown
  8316 1726773023.58605: calling self._execute()
  8316 1726773023.58654: variable 'ansible_host' from source: host vars for 'managed_node3'
  8316 1726773023.58663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8316 1726773023.58672: variable 'omit' from source: magic vars
  8316 1726773023.59028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
  8316 1726773023.60735: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
  8316 1726773023.60791: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
  8316 1726773023.60819: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
  8316 1726773023.60842: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
  8316 1726773023.60860: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
  8316 1726773023.60920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8316 1726773023.60939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8316 1726773023.60954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8316 1726773023.60979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8316 1726773023.60989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8316 1726773023.61029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8316 1726773023.61044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8316 1726773023.61059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8316 1726773023.61081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8316 1726773023.61091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8316 1726773023.61221: variable '__kernel_settings_required_facts' from source: role '' all vars
  8316 1726773023.61233: variable 'ansible_facts' from source: unknown
  8316 1726773023.61257: Evaluated conditional (__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0): True
  8316 1726773023.61265: variable 'omit' from source: magic vars
  8316 1726773023.61309: variable 'omit' from source: magic vars
  8316 1726773023.61329: variable '__kernel_settings_required_facts_subsets' from source: role '' all vars
  8316 1726773023.61391: variable '__kernel_settings_required_facts_subsets' from source: role '' all vars
  8316 1726773023.61462: variable '__kernel_settings_required_facts' from source: role '' all vars
  8316 1726773023.61518: variable 'omit' from source: magic vars
  8316 1726773023.61540: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8316 1726773023.61575: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8316 1726773023.61594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8316 1726773023.61608: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8316 1726773023.61618: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8316 1726773023.61640: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8316 1726773023.61645: variable 'ansible_host' from source: host vars for 'managed_node3'
  8316 1726773023.61649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8316 1726773023.61715: Set connection var ansible_pipelining to False
  8316 1726773023.61726: Set connection var ansible_timeout to 10
  8316 1726773023.61732: Set connection var ansible_module_compression to ZIP_DEFLATED
  8316 1726773023.61737: Set connection var ansible_shell_executable to /bin/sh
  8316 1726773023.61740: Set connection var ansible_connection to ssh
  8316 1726773023.61746: Set connection var ansible_shell_type to sh
  8316 1726773023.61764: variable 'ansible_shell_executable' from source: unknown
  8316 1726773023.61769: variable 'ansible_connection' from source: unknown
  8316 1726773023.61773: variable 'ansible_module_compression' from source: unknown
  8316 1726773023.61776: variable 'ansible_shell_type' from source: unknown
  8316 1726773023.61779: variable 'ansible_shell_executable' from source: unknown
  8316 1726773023.61782: variable 'ansible_host' from source: host vars for 'managed_node3'
  8316 1726773023.61787: variable 'ansible_pipelining' from source: unknown
  8316 1726773023.61790: variable 'ansible_timeout' from source: unknown
  8316 1726773023.61795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8316 1726773023.61883: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
  8316 1726773023.61896: variable 'omit' from source: magic vars
  8316 1726773023.61903: starting attempt loop
  8316 1726773023.61907: running the handler
  8316 1726773023.61916: _low_level_execute_command(): starting
  8316 1726773023.61923: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  8316 1726773023.64395: stderr chunk (state=2):
>>>Warning: Permanently added '10.31.47.99' (ECDSA) to the list of known hosts.
<<<
  8316 1726773023.77490: stdout chunk (state=3):
>>>/root
<<<
  8316 1726773023.77719: stderr chunk (state=3):
>>><<<
  8316 1726773023.77727: stdout chunk (state=3):
>>><<<
  8316 1726773023.77746: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=Warning: Permanently added '10.31.47.99' (ECDSA) to the list of known hosts.
  8316 1726773023.77760: _low_level_execute_command(): starting
  8316 1726773023.77767: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773023.7775488-8316-280416202011683 `" && echo ansible-tmp-1726773023.7775488-8316-280416202011683="` echo /root/.ansible/tmp/ansible-tmp-1726773023.7775488-8316-280416202011683 `" ) && sleep 0'
  8316 1726773023.80494: stdout chunk (state=2):
>>>ansible-tmp-1726773023.7775488-8316-280416202011683=/root/.ansible/tmp/ansible-tmp-1726773023.7775488-8316-280416202011683
<<<
  8316 1726773023.80625: stderr chunk (state=3):
>>><<<
  8316 1726773023.80634: stdout chunk (state=3):
>>><<<
  8316 1726773023.80650: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773023.7775488-8316-280416202011683=/root/.ansible/tmp/ansible-tmp-1726773023.7775488-8316-280416202011683
, stderr=
  8316 1726773023.80693: variable 'ansible_module_compression' from source: unknown
  8316 1726773023.80736: ANSIBALLZ: Using lock for setup
  8316 1726773023.80740: ANSIBALLZ: Acquiring lock
  8316 1726773023.80744: ANSIBALLZ: Lock acquired: 140242353218592
  8316 1726773023.80748: ANSIBALLZ: Creating module
  8316 1726773024.02744: ANSIBALLZ: Writing module into payload
  8316 1726773024.02927: ANSIBALLZ: Writing module
  8316 1726773024.02959: ANSIBALLZ: Renaming module
  8316 1726773024.02967: ANSIBALLZ: Done creating module
  8316 1726773024.03007: variable 'ansible_facts' from source: unknown
  8316 1726773024.03016: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8316 1726773024.03026: _low_level_execute_command(): starting
  8316 1726773024.03033: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'python3.6'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'/usr/libexec/platform-python'"'"'; command -v '"'"'python2.7'"'"'; command -v '"'"'/usr/bin/python'"'"'; command -v '"'"'python'"'"'; echo ENDFOUND && sleep 0'
  8316 1726773024.05704: stdout chunk (state=2):
>>>PLATFORM
Linux
FOUND
/usr/bin/python3.12
/usr/bin/python3.6
/usr/bin/python3
/usr/libexec/platform-python
<<<
  8316 1726773024.05797: stdout chunk (state=3):
>>>ENDFOUND
<<<
  8316 1726773024.06079: stderr chunk (state=3):
>>><<<
  8316 1726773024.06088: stdout chunk (state=3):
>>><<<
  8316 1726773024.06105: _low_level_execute_command() done: rc=0, stdout=PLATFORM
Linux
FOUND
/usr/bin/python3.12
/usr/bin/python3.6
/usr/bin/python3
/usr/libexec/platform-python
ENDFOUND
, stderr=
  8316 1726773024.06113 [managed_node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3.6', '/usr/bin/python3', '/usr/libexec/platform-python']
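
The PLATFORM/FOUND/ENDFOUND probe above is ansible-core's automatic Python interpreter discovery: it lists candidate interpreters with 'command -v', and the next round trip runs a small script that reads /etc/os-release before an interpreter is chosen for module execution (this run ends up using /usr/libexec/platform-python). Discovery can be skipped entirely by pinning the interpreter per host; a minimal sketch, assuming a hypothetical host_vars file:

# host_vars/managed_node3.yml (hypothetical file)
# Pin the interpreter so the discovery probe above is not needed.
ansible_python_interpreter: /usr/libexec/platform-python
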
  8316 1726773024.06150: _low_level_execute_command(): starting
  8316 1726773024.06161: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0'
  8316 1726773024.06600: Sending initial data
  8316 1726773024.06608: Sent initial data (1234 bytes)
  8316 1726773024.10295: stdout chunk (state=3):
>>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"8\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"8\"\nPLATFORM_ID=\"platform:el8\"\nPRETTY_NAME=\"CentOS Stream 8\"\nANSI_COLOR=\"0;31\"\nCPE_NAME=\"cpe:/o:centos:centos:8\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 8\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"}
<<<
  8316 1726773024.10767: stderr chunk (state=3):
>>><<<
  8316 1726773024.10775: stdout chunk (state=3):
>>><<<
  8316 1726773024.10790: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"8\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"8\"\nPLATFORM_ID=\"platform:el8\"\nPRETTY_NAME=\"CentOS Stream 8\"\nANSI_COLOR=\"0;31\"\nCPE_NAME=\"cpe:/o:centos:centos:8\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 8\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"}
, stderr=
  8316 1726773024.10832: variable 'ansible_facts' from source: unknown
  8316 1726773024.10839: variable 'ansible_facts' from source: unknown
  8316 1726773024.10850: variable 'ansible_module_compression' from source: unknown
  8316 1726773024.10884: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
  8316 1726773024.10911: variable 'ansible_facts' from source: unknown
  8316 1726773024.11078: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773023.7775488-8316-280416202011683/AnsiballZ_setup.py
  8316 1726773024.11195: Sending initial data
  8316 1726773024.11202: Sent initial data (152 bytes)
  8316 1726773024.14584: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmp8s3j1lrd /root/.ansible/tmp/ansible-tmp-1726773023.7775488-8316-280416202011683/AnsiballZ_setup.py
<<<
  8316 1726773024.17613: stderr chunk (state=3):
>>><<<
  8316 1726773024.17625: stdout chunk (state=3):
>>><<<
  8316 1726773024.17651: done transferring module to remote
  8316 1726773024.17667: _low_level_execute_command(): starting
  8316 1726773024.17672: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773023.7775488-8316-280416202011683/ /root/.ansible/tmp/ansible-tmp-1726773023.7775488-8316-280416202011683/AnsiballZ_setup.py && sleep 0'
  8316 1726773024.20393: stderr chunk (state=2):
>>><<<
  8316 1726773024.20405: stdout chunk (state=2):
>>><<<
  8316 1726773024.20424: _low_level_execute_command() done: rc=0, stdout=, stderr=
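
Pipelining is disabled on this connection (the task setups in this log show 'Set connection var ansible_pipelining to False'), so every module invocation pays for an SFTP put of the AnsiballZ payload plus the chmod round trip above before the module is executed. Enabling pipelining sends the payload over the SSH session's stdin and skips both steps; with privilege escalation it generally also requires that requiretty not be enforced in sudoers. A minimal sketch, assuming a hypothetical group_vars file:

# group_vars/all.yml (hypothetical file)
# Send module payloads over stdin instead of SFTP + chmod + execute.
ansible_pipelining: true
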
  8316 1726773024.20430: _low_level_execute_command(): starting
  8316 1726773024.20436: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773023.7775488-8316-280416202011683/AnsiballZ_setup.py && sleep 0'
  8316 1726773024.47441: stdout chunk (state=2):
>>>
{"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "8", "ansible_distribution_major_version": "8", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "gather_subset": ["!all", "!min", "distribution", "distribution_major_version", "distribution_version", "os_family"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["!all", "!min", "distribution", "distribution_major_version", "distribution_version", "os_family"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}}
<<<
  8316 1726773024.49131: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8316 1726773024.49142: stdout chunk (state=3):
>>><<<
  8316 1726773024.49152: stderr chunk (state=3):
>>><<<
  8316 1726773024.49166: _low_level_execute_command() done: rc=0, stdout=
{"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "8", "ansible_distribution_major_version": "8", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "gather_subset": ["!all", "!min", "distribution", "distribution_major_version", "distribution_version", "os_family"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["!all", "!min", "distribution", "distribution_major_version", "distribution_version", "os_family"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8316 1726773024.49201: done with _execute_module (setup, {'gather_subset': ['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'], '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773023.7775488-8316-280416202011683/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  8316 1726773024.49221: _low_level_execute_command(): starting
  8316 1726773024.49227: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773023.7775488-8316-280416202011683/ > /dev/null 2>&1 && sleep 0'
  8316 1726773024.51723: stderr chunk (state=2):
>>><<<
  8316 1726773024.51733: stdout chunk (state=2):
>>><<<
  8316 1726773024.51748: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8316 1726773024.51757: handler run complete
  8316 1726773024.51770: variable 'ansible_facts' from source: unknown
  8316 1726773024.51801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8316 1726773024.51836: variable 'ansible_facts' from source: unknown
  8316 1726773024.51860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8316 1726773024.51875: attempt loop complete, returning result
  8316 1726773024.51879: _execute() done
  8316 1726773024.51882: dumping result to json
  8316 1726773024.51889: done dumping result, returning
  8316 1726773024.51897: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role [0affffe7-6841-6cfb-81ae-000000000085]
  8316 1726773024.51903: sending task result for task 0affffe7-6841-6cfb-81ae-000000000085
  8316 1726773024.51933: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000085
  8316 1726773024.51936: WORKER PROCESS EXITING
ok: [managed_node3]
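
The module arguments recorded above show that this 'Ensure ansible_facts used by role' step gathers only the distribution-related fact subsets instead of running a full setup. Based on those logged arguments, the task is roughly equivalent to the sketch below (the actual role may build the subset list from a variable):

- name: Ensure ansible_facts used by role
  setup:
    gather_subset:
      - "!all"
      - "!min"
      - distribution
      - distribution_major_version
      - distribution_version
      - os_family
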
  8303 1726773024.52081: no more pending results, returning what we have
  8303 1726773024.52083: results queue empty
  8303 1726773024.52084: checking for any_errors_fatal
  8303 1726773024.52087: done checking for any_errors_fatal
  8303 1726773024.52088: checking for max_fail_percentage
  8303 1726773024.52089: done checking for max_fail_percentage
  8303 1726773024.52090: checking to see if all hosts have failed and the running result is not ok
  8303 1726773024.52090: done checking to see if all hosts have failed
  8303 1726773024.52091: getting the remaining hosts for this loop
  8303 1726773024.52092: done getting the remaining hosts for this loop
  8303 1726773024.52095: getting the next task for host managed_node3
  8303 1726773024.52103: done getting next task for host managed_node3
  8303 1726773024.52106:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree
  8303 1726773024.52110:  ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773024.52118: getting variables
  8303 1726773024.52119: in VariableManager get_vars()
  8303 1726773024.52148: Calling all_inventory to load vars for managed_node3
  8303 1726773024.52150: Calling groups_inventory to load vars for managed_node3
  8303 1726773024.52152: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773024.52160: Calling all_plugins_play to load vars for managed_node3
  8303 1726773024.52163: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773024.52165: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773024.52222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773024.52251: done with get_vars()
  8303 1726773024.52258: done getting variables
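
After each result, the lines above show the linear strategy's bookkeeping: drain the results queue, check the any_errors_fatal and max_fail_percentage policies, ask the play iterator for the next task, and rebuild the variable context via VariableManager.get_vars(). The two policies correspond to play-level keywords; a minimal, entirely hypothetical play header for illustration:

# Hypothetical play header; these keywords drive the checks logged above.
- hosts: all
  any_errors_fatal: false     # when true, one failed host aborts the play for all hosts
  max_fail_percentage: 0      # abort the batch once more than this percentage of hosts fail
  tasks: []
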

TASK [fedora.linux_system_roles.kernel_settings : Check if system is ostree] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10
Thursday 19 September 2024  15:10:24 -0400 (0:00:00.942)       0:00:01.101 **** 
  8303 1726773024.52327: entering _queue_task() for managed_node3/stat
  8303 1726773024.52488: worker is 1 (out of 1 available)
  8303 1726773024.52502: exiting _queue_task() for managed_node3/stat
  8303 1726773024.52513: done queuing things up, now waiting for results queue to drain
  8303 1726773024.52515: waiting for pending results...
  8344 1726773024.52614: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree
  8344 1726773024.52729: in run() - task 0affffe7-6841-6cfb-81ae-000000000087
  8344 1726773024.52744: variable 'ansible_search_path' from source: unknown
  8344 1726773024.52749: variable 'ansible_search_path' from source: unknown
  8344 1726773024.52778: calling self._execute()
  8344 1726773024.52827: variable 'ansible_host' from source: host vars for 'managed_node3'
  8344 1726773024.52835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8344 1726773024.52845: variable 'omit' from source: magic vars
  8344 1726773024.53173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
  8344 1726773024.53346: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
  8344 1726773024.53382: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
  8344 1726773024.53430: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
  8344 1726773024.53452: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
  8344 1726773024.53518: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
  8344 1726773024.53539: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
  8344 1726773024.53560: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
  8344 1726773024.53580: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
  8344 1726773024.53672: Evaluated conditional (not __kernel_settings_is_ostree is defined): True
  8344 1726773024.53681: variable 'omit' from source: magic vars
  8344 1726773024.53722: variable 'omit' from source: magic vars
  8344 1726773024.53744: variable 'omit' from source: magic vars
  8344 1726773024.53762: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8344 1726773024.53781: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8344 1726773024.53805: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8344 1726773024.53819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8344 1726773024.53830: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8344 1726773024.53852: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8344 1726773024.53856: variable 'ansible_host' from source: host vars for 'managed_node3'
  8344 1726773024.53859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8344 1726773024.53920: Set connection var ansible_pipelining to False
  8344 1726773024.53929: Set connection var ansible_timeout to 10
  8344 1726773024.53934: Set connection var ansible_module_compression to ZIP_DEFLATED
  8344 1726773024.53938: Set connection var ansible_shell_executable to /bin/sh
  8344 1726773024.53939: Set connection var ansible_connection to ssh
  8344 1726773024.53945: Set connection var ansible_shell_type to sh
  8344 1726773024.53959: variable 'ansible_shell_executable' from source: unknown
  8344 1726773024.53963: variable 'ansible_connection' from source: unknown
  8344 1726773024.53964: variable 'ansible_module_compression' from source: unknown
  8344 1726773024.53966: variable 'ansible_shell_type' from source: unknown
  8344 1726773024.53968: variable 'ansible_shell_executable' from source: unknown
  8344 1726773024.53969: variable 'ansible_host' from source: host vars for 'managed_node3'
  8344 1726773024.53971: variable 'ansible_pipelining' from source: unknown
  8344 1726773024.53973: variable 'ansible_timeout' from source: unknown
  8344 1726773024.53975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8344 1726773024.54064: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
  8344 1726773024.54071: variable 'omit' from source: magic vars
  8344 1726773024.54075: starting attempt loop
  8344 1726773024.54077: running the handler
  8344 1726773024.54087: _low_level_execute_command(): starting
  8344 1726773024.54093: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  8344 1726773024.56446: stdout chunk (state=2):
>>>/root
<<<
  8344 1726773024.56570: stderr chunk (state=3):
>>><<<
  8344 1726773024.56578: stdout chunk (state=3):
>>><<<
  8344 1726773024.56597: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  8344 1726773024.56611: _low_level_execute_command(): starting
  8344 1726773024.56617: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773024.5660508-8344-233885081925994 `" && echo ansible-tmp-1726773024.5660508-8344-233885081925994="` echo /root/.ansible/tmp/ansible-tmp-1726773024.5660508-8344-233885081925994 `" ) && sleep 0'
  8344 1726773024.59116: stdout chunk (state=2):
>>>ansible-tmp-1726773024.5660508-8344-233885081925994=/root/.ansible/tmp/ansible-tmp-1726773024.5660508-8344-233885081925994
<<<
  8344 1726773024.59246: stderr chunk (state=3):
>>><<<
  8344 1726773024.59253: stdout chunk (state=3):
>>><<<
  8344 1726773024.59273: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773024.5660508-8344-233885081925994=/root/.ansible/tmp/ansible-tmp-1726773024.5660508-8344-233885081925994
, stderr=
  8344 1726773024.59312: variable 'ansible_module_compression' from source: unknown
  8344 1726773024.59359: ANSIBALLZ: Using lock for stat
  8344 1726773024.59364: ANSIBALLZ: Acquiring lock
  8344 1726773024.59368: ANSIBALLZ: Lock acquired: 140242353218256
  8344 1726773024.59371: ANSIBALLZ: Creating module
  8344 1726773024.68142: ANSIBALLZ: Writing module into payload
  8344 1726773024.68230: ANSIBALLZ: Writing module
  8344 1726773024.68249: ANSIBALLZ: Renaming module
  8344 1726773024.68259: ANSIBALLZ: Done creating module
  8344 1726773024.68274: variable 'ansible_facts' from source: unknown
  8344 1726773024.68336: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773024.5660508-8344-233885081925994/AnsiballZ_stat.py
  8344 1726773024.68440: Sending initial data
  8344 1726773024.68447: Sent initial data (151 bytes)
  8344 1726773024.71089: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmph7_6ero_ /root/.ansible/tmp/ansible-tmp-1726773024.5660508-8344-233885081925994/AnsiballZ_stat.py
<<<
  8344 1726773024.72282: stderr chunk (state=3):
>>><<<
  8344 1726773024.72294: stdout chunk (state=3):
>>><<<
  8344 1726773024.72313: done transferring module to remote
  8344 1726773024.72323: _low_level_execute_command(): starting
  8344 1726773024.72329: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773024.5660508-8344-233885081925994/ /root/.ansible/tmp/ansible-tmp-1726773024.5660508-8344-233885081925994/AnsiballZ_stat.py && sleep 0'
  8344 1726773024.74722: stderr chunk (state=2):
>>><<<
  8344 1726773024.74734: stdout chunk (state=2):
>>><<<
  8344 1726773024.74752: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8344 1726773024.74757: _low_level_execute_command(): starting
  8344 1726773024.74763: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773024.5660508-8344-233885081925994/AnsiballZ_stat.py && sleep 0'
  8344 1726773024.89863: stdout chunk (state=2):
>>>
{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}}
<<<
  8344 1726773024.90929: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8344 1726773024.90978: stderr chunk (state=3):
>>><<<
  8344 1726773024.90986: stdout chunk (state=3):
>>><<<
  8344 1726773024.91002: _low_level_execute_command() done: rc=0, stdout=
{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8344 1726773024.91029: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773024.5660508-8344-233885081925994/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  8344 1726773024.91040: _low_level_execute_command(): starting
  8344 1726773024.91046: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773024.5660508-8344-233885081925994/ > /dev/null 2>&1 && sleep 0'
  8344 1726773024.93499: stderr chunk (state=2):
>>><<<
  8344 1726773024.93509: stdout chunk (state=2):
>>><<<
  8344 1726773024.93524: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8344 1726773024.93532: handler run complete
  8344 1726773024.93548: attempt loop complete, returning result
  8344 1726773024.93552: _execute() done
  8344 1726773024.93558: dumping result to json
  8344 1726773024.93563: done dumping result, returning
  8344 1726773024.93572: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree [0affffe7-6841-6cfb-81ae-000000000087]
  8344 1726773024.93577: sending task result for task 0affffe7-6841-6cfb-81ae-000000000087
  8344 1726773024.93607: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000087
  8344 1726773024.93611: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
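
From the evaluated conditional (not __kernel_settings_is_ostree is defined) and the logged stat arguments (path=/run/ostree-booted), the task can be reconstructed approximately as below; the register name is inferred from the '__ostree_booted_stat' variable that the following set_fact task reads:

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat   # name inferred from the next task's logged variables
  when: not __kernel_settings_is_ostree is defined
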
  8303 1726773024.93743: no more pending results, returning what we have
  8303 1726773024.93745: results queue empty
  8303 1726773024.93746: checking for any_errors_fatal
  8303 1726773024.93752: done checking for any_errors_fatal
  8303 1726773024.93753: checking for max_fail_percentage
  8303 1726773024.93754: done checking for max_fail_percentage
  8303 1726773024.93755: checking to see if all hosts have failed and the running result is not ok
  8303 1726773024.93755: done checking to see if all hosts have failed
  8303 1726773024.93756: getting the remaining hosts for this loop
  8303 1726773024.93757: done getting the remaining hosts for this loop
  8303 1726773024.93760: getting the next task for host managed_node3
  8303 1726773024.93765: done getting next task for host managed_node3
  8303 1726773024.93767:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree
  8303 1726773024.93771:  ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773024.93779: getting variables
  8303 1726773024.93780: in VariableManager get_vars()
  8303 1726773024.93812: Calling all_inventory to load vars for managed_node3
  8303 1726773024.93815: Calling groups_inventory to load vars for managed_node3
  8303 1726773024.93816: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773024.93825: Calling all_plugins_play to load vars for managed_node3
  8303 1726773024.93827: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773024.93829: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773024.93874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773024.93909: done with get_vars()
  8303 1726773024.93916: done getting variables
  8303 1726773024.93984: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:15
Thursday 19 September 2024  15:10:24 -0400 (0:00:00.416)       0:00:01.518 **** 
  8303 1726773024.94012: entering _queue_task() for managed_node3/set_fact
  8303 1726773024.94013: Creating lock for set_fact
  8303 1726773024.94181: worker is 1 (out of 1 available)
  8303 1726773024.94198: exiting _queue_task() for managed_node3/set_fact
  8303 1726773024.94209: done queuing things up, now waiting for results queue to drain
  8303 1726773024.94210: waiting for pending results...
  8352 1726773024.94311: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree
  8352 1726773024.94426: in run() - task 0affffe7-6841-6cfb-81ae-000000000088
  8352 1726773024.94441: variable 'ansible_search_path' from source: unknown
  8352 1726773024.94444: variable 'ansible_search_path' from source: unknown
  8352 1726773024.94471: calling self._execute()
  8352 1726773024.94520: variable 'ansible_host' from source: host vars for 'managed_node3'
  8352 1726773024.94528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8352 1726773024.94537: variable 'omit' from source: magic vars
  8352 1726773024.94866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
  8352 1726773024.95070: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
  8352 1726773024.95106: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
  8352 1726773024.95132: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
  8352 1726773024.95161: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
  8352 1726773024.95223: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
  8352 1726773024.95243: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
  8352 1726773024.95265: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
  8352 1726773024.95286: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
  8352 1726773024.95375: Evaluated conditional (not __kernel_settings_is_ostree is defined): True
  8352 1726773024.95383: variable 'omit' from source: magic vars
  8352 1726773024.95423: variable 'omit' from source: magic vars
  8352 1726773024.95504: variable '__ostree_booted_stat' from source: set_fact
  8352 1726773024.95543: variable 'omit' from source: magic vars
  8352 1726773024.95564: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8352 1726773024.95586: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8352 1726773024.95603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8352 1726773024.95617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8352 1726773024.95626: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8352 1726773024.95649: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8352 1726773024.95654: variable 'ansible_host' from source: host vars for 'managed_node3'
  8352 1726773024.95661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8352 1726773024.95725: Set connection var ansible_pipelining to False
  8352 1726773024.95736: Set connection var ansible_timeout to 10
  8352 1726773024.95742: Set connection var ansible_module_compression to ZIP_DEFLATED
  8352 1726773024.95748: Set connection var ansible_shell_executable to /bin/sh
  8352 1726773024.95751: Set connection var ansible_connection to ssh
  8352 1726773024.95759: Set connection var ansible_shell_type to sh
  8352 1726773024.95775: variable 'ansible_shell_executable' from source: unknown
  8352 1726773024.95779: variable 'ansible_connection' from source: unknown
  8352 1726773024.95782: variable 'ansible_module_compression' from source: unknown
  8352 1726773024.95786: variable 'ansible_shell_type' from source: unknown
  8352 1726773024.95790: variable 'ansible_shell_executable' from source: unknown
  8352 1726773024.95793: variable 'ansible_host' from source: host vars for 'managed_node3'
  8352 1726773024.95798: variable 'ansible_pipelining' from source: unknown
  8352 1726773024.95801: variable 'ansible_timeout' from source: unknown
  8352 1726773024.95805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8352 1726773024.95869: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  8352 1726773024.95879: variable 'omit' from source: magic vars
  8352 1726773024.95886: starting attempt loop
  8352 1726773024.95890: running the handler
  8352 1726773024.95898: handler run complete
  8352 1726773024.95905: attempt loop complete, returning result
  8352 1726773024.95908: _execute() done
  8352 1726773024.95911: dumping result to json
  8352 1726773024.95914: done dumping result, returning
  8352 1726773024.95920: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree [0affffe7-6841-6cfb-81ae-000000000088]
  8352 1726773024.95927: sending task result for task 0affffe7-6841-6cfb-81ae-000000000088
  8352 1726773024.95949: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000088
  8352 1726773024.95952: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "__kernel_settings_is_ostree": false
    },
    "changed": false
}
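
This set_fact turns the registered stat result into the role flag; with /run/ostree-booted absent, __kernel_settings_is_ostree comes out false, exactly as shown in the result above. A sketch consistent with the logged variables (the Jinja expression is an inference):

- name: Set flag to indicate system is ostree
  set_fact:
    __kernel_settings_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"  # inferred expression
  when: not __kernel_settings_is_ostree is defined
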
  8303 1726773024.96104: no more pending results, returning what we have
  8303 1726773024.96107: results queue empty
  8303 1726773024.96108: checking for any_errors_fatal
  8303 1726773024.96111: done checking for any_errors_fatal
  8303 1726773024.96112: checking for max_fail_percentage
  8303 1726773024.96113: done checking for max_fail_percentage
  8303 1726773024.96114: checking to see if all hosts have failed and the running result is not ok
  8303 1726773024.96114: done checking to see if all hosts have failed
  8303 1726773024.96115: getting the remaining hosts for this loop
  8303 1726773024.96116: done getting the remaining hosts for this loop
  8303 1726773024.96119: getting the next task for host managed_node3
  8303 1726773024.96126: done getting next task for host managed_node3
  8303 1726773024.96129:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin
  8303 1726773024.96132:  ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773024.96140: getting variables
  8303 1726773024.96141: in VariableManager get_vars()
  8303 1726773024.96176: Calling all_inventory to load vars for managed_node3
  8303 1726773024.96179: Calling groups_inventory to load vars for managed_node3
  8303 1726773024.96181: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773024.96191: Calling all_plugins_play to load vars for managed_node3
  8303 1726773024.96193: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773024.96196: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773024.96241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773024.96282: done with get_vars()
  8303 1726773024.96292: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:22
Thursday 19 September 2024  15:10:24 -0400 (0:00:00.023)       0:00:01.542 **** 
  8303 1726773024.96381: entering _queue_task() for managed_node3/stat
  8303 1726773024.96573: worker is 1 (out of 1 available)
  8303 1726773024.96589: exiting _queue_task() for managed_node3/stat
  8303 1726773024.96599: done queuing things up, now waiting for results queue to drain
  8303 1726773024.96600: waiting for pending results...
  8355 1726773024.96795: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin
  8355 1726773024.96930: in run() - task 0affffe7-6841-6cfb-81ae-00000000008a
  8355 1726773024.96947: variable 'ansible_search_path' from source: unknown
  8355 1726773024.96951: variable 'ansible_search_path' from source: unknown
  8355 1726773024.96983: calling self._execute()
  8355 1726773024.97045: variable 'ansible_host' from source: host vars for 'managed_node3'
  8355 1726773024.97055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8355 1726773024.97063: variable 'omit' from source: magic vars
  8355 1726773024.97555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
  8355 1726773024.97797: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
  8355 1726773024.97839: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
  8355 1726773024.97870: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
  8355 1726773024.97903: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
  8355 1726773024.97978: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
  8355 1726773024.98005: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
  8355 1726773024.98029: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
  8355 1726773024.98055: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
  8355 1726773024.98171: Evaluated conditional (not __kernel_settings_is_transactional is defined): True
  8355 1726773024.98180: variable 'omit' from source: magic vars
  8355 1726773024.98237: variable 'omit' from source: magic vars
  8355 1726773024.98269: variable 'omit' from source: magic vars
  8355 1726773024.98297: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8355 1726773024.98323: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8355 1726773024.98343: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8355 1726773024.98360: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8355 1726773024.98372: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8355 1726773024.98402: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8355 1726773024.98408: variable 'ansible_host' from source: host vars for 'managed_node3'
  8355 1726773024.98412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8355 1726773024.98503: Set connection var ansible_pipelining to False
  8355 1726773024.98514: Set connection var ansible_timeout to 10
  8355 1726773024.98520: Set connection var ansible_module_compression to ZIP_DEFLATED
  8355 1726773024.98525: Set connection var ansible_shell_executable to /bin/sh
  8355 1726773024.98528: Set connection var ansible_connection to ssh
  8355 1726773024.98535: Set connection var ansible_shell_type to sh
  8355 1726773024.98555: variable 'ansible_shell_executable' from source: unknown
  8355 1726773024.98560: variable 'ansible_connection' from source: unknown
  8355 1726773024.98563: variable 'ansible_module_compression' from source: unknown
  8355 1726773024.98566: variable 'ansible_shell_type' from source: unknown
  8355 1726773024.98569: variable 'ansible_shell_executable' from source: unknown
  8355 1726773024.98572: variable 'ansible_host' from source: host vars for 'managed_node3'
  8355 1726773024.98575: variable 'ansible_pipelining' from source: unknown
  8355 1726773024.98578: variable 'ansible_timeout' from source: unknown
  8355 1726773024.98582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8355 1726773024.98706: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
  8355 1726773024.98717: variable 'omit' from source: magic vars
  8355 1726773024.98723: starting attempt loop
  8355 1726773024.98727: running the handler
  8355 1726773024.98738: _low_level_execute_command(): starting
  8355 1726773024.98745: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  8355 1726773025.01245: stdout chunk (state=2):
>>>/root
<<<
  8355 1726773025.01368: stderr chunk (state=3):
>>><<<
  8355 1726773025.01376: stdout chunk (state=3):
>>><<<
  8355 1726773025.01396: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  8355 1726773025.01409: _low_level_execute_command(): starting
  8355 1726773025.01415: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773025.0140343-8355-137237898179181 `" && echo ansible-tmp-1726773025.0140343-8355-137237898179181="` echo /root/.ansible/tmp/ansible-tmp-1726773025.0140343-8355-137237898179181 `" ) && sleep 0'
  8355 1726773025.03963: stdout chunk (state=2):
>>>ansible-tmp-1726773025.0140343-8355-137237898179181=/root/.ansible/tmp/ansible-tmp-1726773025.0140343-8355-137237898179181
<<<
  8355 1726773025.04096: stderr chunk (state=3):
>>><<<
  8355 1726773025.04103: stdout chunk (state=3):
>>><<<
  8355 1726773025.04121: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773025.0140343-8355-137237898179181=/root/.ansible/tmp/ansible-tmp-1726773025.0140343-8355-137237898179181
, stderr=
  8355 1726773025.04162: variable 'ansible_module_compression' from source: unknown
  8355 1726773025.04209: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
  8355 1726773025.04239: variable 'ansible_facts' from source: unknown
  8355 1726773025.04310: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773025.0140343-8355-137237898179181/AnsiballZ_stat.py
  8355 1726773025.04413: Sending initial data
  8355 1726773025.04420: Sent initial data (151 bytes)
  8355 1726773025.07046: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmp1sl6c7mt /root/.ansible/tmp/ansible-tmp-1726773025.0140343-8355-137237898179181/AnsiballZ_stat.py
<<<
  8355 1726773025.08246: stderr chunk (state=3):
>>><<<
  8355 1726773025.08258: stdout chunk (state=3):
>>><<<
  8355 1726773025.08277: done transferring module to remote
  8355 1726773025.08287: _low_level_execute_command(): starting
  8355 1726773025.08292: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773025.0140343-8355-137237898179181/ /root/.ansible/tmp/ansible-tmp-1726773025.0140343-8355-137237898179181/AnsiballZ_stat.py && sleep 0'
  8355 1726773025.10733: stderr chunk (state=2):
>>><<<
  8355 1726773025.10746: stdout chunk (state=2):
>>><<<
  8355 1726773025.10764: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8355 1726773025.10769: _low_level_execute_command(): starting
  8355 1726773025.10774: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773025.0140343-8355-137237898179181/AnsiballZ_stat.py && sleep 0'
  8355 1726773025.25822: stdout chunk (state=2):
>>>
{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/sbin/transactional-update", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}}
<<<
  8355 1726773025.26877: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8355 1726773025.26928: stderr chunk (state=3):
>>><<<
  8355 1726773025.26935: stdout chunk (state=3):
>>><<<
  8355 1726773025.26951: _low_level_execute_command() done: rc=0, stdout=
{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/sbin/transactional-update", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8355 1726773025.26981: done with _execute_module (stat, {'path': '/sbin/transactional-update', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773025.0140343-8355-137237898179181/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  8355 1726773025.26992: _low_level_execute_command(): starting
  8355 1726773025.26998: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773025.0140343-8355-137237898179181/ > /dev/null 2>&1 && sleep 0'
  8355 1726773025.29511: stderr chunk (state=2):
>>><<<
  8355 1726773025.29523: stdout chunk (state=2):
>>><<<
  8355 1726773025.29540: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8355 1726773025.29546: handler run complete
  8355 1726773025.29562: attempt loop complete, returning result
  8355 1726773025.29566: _execute() done
  8355 1726773025.29569: dumping result to json
  8355 1726773025.29574: done dumping result, returning
  8355 1726773025.29582: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin [0affffe7-6841-6cfb-81ae-00000000008a]
  8355 1726773025.29588: sending task result for task 0affffe7-6841-6cfb-81ae-00000000008a
  8355 1726773025.29618: done sending task result for task 0affffe7-6841-6cfb-81ae-00000000008a
  8355 1726773025.29621: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
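
The same pattern is used to detect transactional-update systems: the logged stat arguments (path=/sbin/transactional-update) and the conditional (not __kernel_settings_is_transactional is defined) suggest roughly the following task, with the register name inferred from the '__transactional_update_stat' variable read by the next task:

- name: Check if transactional-update exists in /sbin
  stat:
    path: /sbin/transactional-update
  register: __transactional_update_stat   # name inferred from the next task's logged variables
  when: not __kernel_settings_is_transactional is defined
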
  8303 1726773025.29764: no more pending results, returning what we have
  8303 1726773025.29767: results queue empty
  8303 1726773025.29768: checking for any_errors_fatal
  8303 1726773025.29773: done checking for any_errors_fatal
  8303 1726773025.29773: checking for max_fail_percentage
  8303 1726773025.29775: done checking for max_fail_percentage
  8303 1726773025.29775: checking to see if all hosts have failed and the running result is not ok
  8303 1726773025.29776: done checking to see if all hosts have failed
  8303 1726773025.29776: getting the remaining hosts for this loop
  8303 1726773025.29777: done getting the remaining hosts for this loop
  8303 1726773025.29781: getting the next task for host managed_node3
  8303 1726773025.29788: done getting next task for host managed_node3
  8303 1726773025.29791:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists
  8303 1726773025.29794:  ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773025.29805: getting variables
  8303 1726773025.29806: in VariableManager get_vars()
  8303 1726773025.29835: Calling all_inventory to load vars for managed_node3
  8303 1726773025.29838: Calling groups_inventory to load vars for managed_node3
  8303 1726773025.29840: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773025.29848: Calling all_plugins_play to load vars for managed_node3
  8303 1726773025.29850: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773025.29852: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773025.29894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773025.29922: done with get_vars()
  8303 1726773025.29928: done getting variables
  8303 1726773025.29972: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:27
Thursday 19 September 2024  15:10:25 -0400 (0:00:00.336)       0:00:01.878 **** 
  8303 1726773025.30001: entering _queue_task() for managed_node3/set_fact
  8303 1726773025.30204: worker is 1 (out of 1 available)
  8303 1726773025.30217: exiting _queue_task() for managed_node3/set_fact
  8303 1726773025.30231: done queuing things up, now waiting for results queue to drain
  8303 1726773025.30232: waiting for pending results...
  8368 1726773025.30332: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists
  8368 1726773025.30448: in run() - task 0affffe7-6841-6cfb-81ae-00000000008b
  8368 1726773025.30464: variable 'ansible_search_path' from source: unknown
  8368 1726773025.30469: variable 'ansible_search_path' from source: unknown
  8368 1726773025.30497: calling self._execute()
  8368 1726773025.30545: variable 'ansible_host' from source: host vars for 'managed_node3'
  8368 1726773025.30553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8368 1726773025.30562: variable 'omit' from source: magic vars
  8368 1726773025.30880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
  8368 1726773025.31052: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
  8368 1726773025.31087: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
  8368 1726773025.31113: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
  8368 1726773025.31142: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
  8368 1726773025.31203: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
  8368 1726773025.31224: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
  8368 1726773025.31244: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
  8368 1726773025.31265: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
  8368 1726773025.31354: Evaluated conditional (not __kernel_settings_is_transactional is defined): True
  8368 1726773025.31363: variable 'omit' from source: magic vars
  8368 1726773025.31403: variable 'omit' from source: magic vars
  8368 1726773025.31479: variable '__transactional_update_stat' from source: set_fact
  8368 1726773025.31520: variable 'omit' from source: magic vars
  8368 1726773025.31541: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8368 1726773025.31562: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8368 1726773025.31578: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8368 1726773025.31594: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8368 1726773025.31603: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8368 1726773025.31625: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8368 1726773025.31630: variable 'ansible_host' from source: host vars for 'managed_node3'
  8368 1726773025.31634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8368 1726773025.31700: Set connection var ansible_pipelining to False
  8368 1726773025.31710: Set connection var ansible_timeout to 10
  8368 1726773025.31716: Set connection var ansible_module_compression to ZIP_DEFLATED
  8368 1726773025.31722: Set connection var ansible_shell_executable to /bin/sh
  8368 1726773025.31725: Set connection var ansible_connection to ssh
  8368 1726773025.31731: Set connection var ansible_shell_type to sh
  8368 1726773025.31746: variable 'ansible_shell_executable' from source: unknown
  8368 1726773025.31750: variable 'ansible_connection' from source: unknown
  8368 1726773025.31753: variable 'ansible_module_compression' from source: unknown
  8368 1726773025.31756: variable 'ansible_shell_type' from source: unknown
  8368 1726773025.31759: variable 'ansible_shell_executable' from source: unknown
  8368 1726773025.31762: variable 'ansible_host' from source: host vars for 'managed_node3'
  8368 1726773025.31766: variable 'ansible_pipelining' from source: unknown
  8368 1726773025.31768: variable 'ansible_timeout' from source: unknown
  8368 1726773025.31770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8368 1726773025.31830: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  8368 1726773025.31838: variable 'omit' from source: magic vars
  8368 1726773025.31842: starting attempt loop
  8368 1726773025.31844: running the handler
  8368 1726773025.31850: handler run complete
  8368 1726773025.31856: attempt loop complete, returning result
  8368 1726773025.31858: _execute() done
  8368 1726773025.31860: dumping result to json
  8368 1726773025.31862: done dumping result, returning
  8368 1726773025.31866: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists [0affffe7-6841-6cfb-81ae-00000000008b]
  8368 1726773025.31870: sending task result for task 0affffe7-6841-6cfb-81ae-00000000008b
  8368 1726773025.31889: done sending task result for task 0affffe7-6841-6cfb-81ae-00000000008b
  8368 1726773025.31892: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "__kernel_settings_is_transactional": false
    },
    "changed": false
}
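
As with the ostree flag, this set_fact converts the stat result into a boolean; /sbin/transactional-update does not exist here, so __kernel_settings_is_transactional is set to false. A sketch consistent with the logged variables (the Jinja expression is an inference):

- name: Set flag if transactional-update exists
  set_fact:
    __kernel_settings_is_transactional: "{{ __transactional_update_stat.stat.exists }}"  # inferred
  when: not __kernel_settings_is_transactional is defined
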
  8303 1726773025.32125: no more pending results, returning what we have
  8303 1726773025.32129: results queue empty
  8303 1726773025.32130: checking for any_errors_fatal
  8303 1726773025.32133: done checking for any_errors_fatal
  8303 1726773025.32134: checking for max_fail_percentage
  8303 1726773025.32135: done checking for max_fail_percentage
  8303 1726773025.32135: checking to see if all hosts have failed and the running result is not ok
  8303 1726773025.32135: done checking to see if all hosts have failed
  8303 1726773025.32136: getting the remaining hosts for this loop
  8303 1726773025.32137: done getting the remaining hosts for this loop
  8303 1726773025.32139: getting the next task for host managed_node3
  8303 1726773025.32144: done getting next task for host managed_node3
  8303 1726773025.32146:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables
  8303 1726773025.32149:  ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773025.32157: getting variables
  8303 1726773025.32162: in VariableManager get_vars()
  8303 1726773025.32183: Calling all_inventory to load vars for managed_node3
  8303 1726773025.32187: Calling groups_inventory to load vars for managed_node3
  8303 1726773025.32188: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773025.32194: Calling all_plugins_play to load vars for managed_node3
  8303 1726773025.32196: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773025.32197: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773025.32231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773025.32261: done with get_vars()
  8303 1726773025.32267: done getting variables
  8303 1726773025.32343: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set platform/version specific variables] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:31
Thursday 19 September 2024  15:10:25 -0400 (0:00:00.023)       0:00:01.902 **** 
  8303 1726773025.32368: entering _queue_task() for managed_node3/include_vars
  8303 1726773025.32369: Creating lock for include_vars
  8303 1726773025.32523: worker is 1 (out of 1 available)
  8303 1726773025.32537: exiting _queue_task() for managed_node3/include_vars
  8303 1726773025.32549: done queuing things up, now waiting for results queue to drain
  8303 1726773025.32550: waiting for pending results...
  8369 1726773025.32649: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables
  8369 1726773025.32754: in run() - task 0affffe7-6841-6cfb-81ae-00000000008d
  8369 1726773025.32769: variable 'ansible_search_path' from source: unknown
  8369 1726773025.32774: variable 'ansible_search_path' from source: unknown
  8369 1726773025.32802: calling self._execute()
  8369 1726773025.32846: variable 'ansible_host' from source: host vars for 'managed_node3'
  8369 1726773025.32854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8369 1726773025.32862: variable 'omit' from source: magic vars
  8369 1726773025.32937: variable 'omit' from source: magic vars
  8369 1726773025.32979: variable 'omit' from source: magic vars
  8369 1726773025.33230: variable 'ffparams' from source: task vars
  8369 1726773025.33321: variable 'ansible_facts' from source: unknown
  8369 1726773025.33416: variable 'ansible_facts' from source: unknown
  8369 1726773025.33476: variable 'ansible_facts' from source: unknown
  8369 1726773025.33535: variable 'ansible_facts' from source: unknown
  8369 1726773025.33589: variable 'role_path' from source: magic vars
  8369 1726773025.33717: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup
  8369 1726773025.33880: Loaded config def from plugin (lookup/first_found)
  8369 1726773025.33889: Loading LookupModule 'first_found' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/first_found.py
  8369 1726773025.33917: variable 'ansible_search_path' from source: unknown
  8369 1726773025.33936: variable 'ansible_search_path' from source: unknown
  8369 1726773025.33944: variable 'ansible_search_path' from source: unknown
  8369 1726773025.33952: variable 'ansible_search_path' from source: unknown
  8369 1726773025.33961: variable 'ansible_search_path' from source: unknown
  8369 1726773025.33978: variable 'omit' from source: magic vars
  8369 1726773025.33999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8369 1726773025.34017: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8369 1726773025.34032: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8369 1726773025.34045: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8369 1726773025.34054: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8369 1726773025.34078: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8369 1726773025.34083: variable 'ansible_host' from source: host vars for 'managed_node3'
  8369 1726773025.34089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8369 1726773025.34149: Set connection var ansible_pipelining to False
  8369 1726773025.34160: Set connection var ansible_timeout to 10
  8369 1726773025.34168: Set connection var ansible_module_compression to ZIP_DEFLATED
  8369 1726773025.34173: Set connection var ansible_shell_executable to /bin/sh
  8369 1726773025.34176: Set connection var ansible_connection to ssh
  8369 1726773025.34183: Set connection var ansible_shell_type to sh
  8369 1726773025.34201: variable 'ansible_shell_executable' from source: unknown
  8369 1726773025.34205: variable 'ansible_connection' from source: unknown
  8369 1726773025.34209: variable 'ansible_module_compression' from source: unknown
  8369 1726773025.34212: variable 'ansible_shell_type' from source: unknown
  8369 1726773025.34215: variable 'ansible_shell_executable' from source: unknown
  8369 1726773025.34218: variable 'ansible_host' from source: host vars for 'managed_node3'
  8369 1726773025.34222: variable 'ansible_pipelining' from source: unknown
  8369 1726773025.34225: variable 'ansible_timeout' from source: unknown
  8369 1726773025.34229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8369 1726773025.34295: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  8369 1726773025.34307: variable 'omit' from source: magic vars
  8369 1726773025.34313: starting attempt loop
  8369 1726773025.34317: running the handler
  8369 1726773025.34360: handler run complete
  8369 1726773025.34370: attempt loop complete, returning result
  8369 1726773025.34373: _execute() done
  8369 1726773025.34377: dumping result to json
  8369 1726773025.34381: done dumping result, returning
  8369 1726773025.34389: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables [0affffe7-6841-6cfb-81ae-00000000008d]
  8369 1726773025.34395: sending task result for task 0affffe7-6841-6cfb-81ae-00000000008d
  8369 1726773025.34420: done sending task result for task 0affffe7-6841-6cfb-81ae-00000000008d
  8369 1726773025.34423: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "__kernel_settings_packages": [
            "tuned",
            "python3-configobj"
        ],
        "__kernel_settings_services": [
            "tuned"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/default.yml"
    ],
    "changed": false
}
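The include_vars task above resolves platform/version specific variable files with the first_found lookup (the log shows the ffparams task var and the role_path magic variable) and, on this host, falls back to vars/default.yml, which supplies __kernel_settings_packages and __kernel_settings_services. A sketch of that pattern, assuming the usual candidate-file layout; only default.yml is confirmed by the output above:

    - name: Set platform/version specific variables
      ansible.builtin.include_vars: "{{ lookup('first_found', ffparams) }}"
      vars:
        ffparams:
          files:
            # illustrative candidates; only default.yml is confirmed by this run
            - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
            - "{{ ansible_facts['distribution'] }}.yml"
            - "{{ ansible_facts['os_family'] }}.yml"
            - default.yml
          paths:
            - "{{ role_path }}/vars"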
  8303 1726773025.34570: no more pending results, returning what we have
  8303 1726773025.34573: results queue empty
  8303 1726773025.34574: checking for any_errors_fatal
  8303 1726773025.34578: done checking for any_errors_fatal
  8303 1726773025.34578: checking for max_fail_percentage
  8303 1726773025.34579: done checking for max_fail_percentage
  8303 1726773025.34580: checking to see if all hosts have failed and the running result is not ok
  8303 1726773025.34581: done checking to see if all hosts have failed
  8303 1726773025.34581: getting the remaining hosts for this loop
  8303 1726773025.34582: done getting the remaining hosts for this loop
  8303 1726773025.34587: getting the next task for host managed_node3
  8303 1726773025.34594: done getting next task for host managed_node3
  8303 1726773025.34596:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed
  8303 1726773025.34599:  ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773025.34609: getting variables
  8303 1726773025.34610: in VariableManager get_vars()
  8303 1726773025.34634: Calling all_inventory to load vars for managed_node3
  8303 1726773025.34636: Calling groups_inventory to load vars for managed_node3
  8303 1726773025.34637: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773025.34643: Calling all_plugins_play to load vars for managed_node3
  8303 1726773025.34645: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773025.34646: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773025.34680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773025.34713: done with get_vars()
  8303 1726773025.34718: done getting variables
  8303 1726773025.34784: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Ensure required packages are installed] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12
Thursday 19 September 2024  15:10:25 -0400 (0:00:00.024)       0:00:01.926 **** 
  8303 1726773025.34808: entering _queue_task() for managed_node3/package
  8303 1726773025.34810: Creating lock for package
  8303 1726773025.34967: worker is 1 (out of 1 available)
  8303 1726773025.34980: exiting _queue_task() for managed_node3/package
  8303 1726773025.34993: done queuing things up, now waiting for results queue to drain
  8303 1726773025.34994: waiting for pending results...
  8370 1726773025.35098: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed
  8370 1726773025.35197: in run() - task 0affffe7-6841-6cfb-81ae-000000000025
  8370 1726773025.35211: variable 'ansible_search_path' from source: unknown
  8370 1726773025.35215: variable 'ansible_search_path' from source: unknown
  8370 1726773025.35242: calling self._execute()
  8370 1726773025.35290: variable 'ansible_host' from source: host vars for 'managed_node3'
  8370 1726773025.35300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8370 1726773025.35309: variable 'omit' from source: magic vars
  8370 1726773025.35381: variable 'omit' from source: magic vars
  8370 1726773025.35418: variable 'omit' from source: magic vars
  8370 1726773025.35439: variable '__kernel_settings_packages' from source: include_vars
  8370 1726773025.35649: variable '__kernel_settings_packages' from source: include_vars
  8370 1726773025.35805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
  8370 1726773025.37333: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
  8370 1726773025.37397: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
  8370 1726773025.37427: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
  8370 1726773025.37453: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
  8370 1726773025.37478: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
  8370 1726773025.37550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8370 1726773025.37574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8370 1726773025.37597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8370 1726773025.37626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8370 1726773025.37638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8370 1726773025.37718: variable '__kernel_settings_is_ostree' from source: set_fact
  8370 1726773025.37725: variable 'omit' from source: magic vars
  8370 1726773025.37748: variable 'omit' from source: magic vars
  8370 1726773025.37773: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8370 1726773025.37796: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8370 1726773025.37813: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8370 1726773025.37828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8370 1726773025.37837: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8370 1726773025.37862: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8370 1726773025.37867: variable 'ansible_host' from source: host vars for 'managed_node3'
  8370 1726773025.37871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8370 1726773025.37940: Set connection var ansible_pipelining to False
  8370 1726773025.37950: Set connection var ansible_timeout to 10
  8370 1726773025.37958: Set connection var ansible_module_compression to ZIP_DEFLATED
  8370 1726773025.37964: Set connection var ansible_shell_executable to /bin/sh
  8370 1726773025.37967: Set connection var ansible_connection to ssh
  8370 1726773025.37973: Set connection var ansible_shell_type to sh
  8370 1726773025.37992: variable 'ansible_shell_executable' from source: unknown
  8370 1726773025.37996: variable 'ansible_connection' from source: unknown
  8370 1726773025.38000: variable 'ansible_module_compression' from source: unknown
  8370 1726773025.38003: variable 'ansible_shell_type' from source: unknown
  8370 1726773025.38006: variable 'ansible_shell_executable' from source: unknown
  8370 1726773025.38010: variable 'ansible_host' from source: host vars for 'managed_node3'
  8370 1726773025.38012: variable 'ansible_pipelining' from source: unknown
  8370 1726773025.38014: variable 'ansible_timeout' from source: unknown
  8370 1726773025.38016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8370 1726773025.38077: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  8370 1726773025.38087: variable 'omit' from source: magic vars
  8370 1726773025.38094: starting attempt loop
  8370 1726773025.38097: running the handler
  8370 1726773025.38164: variable 'ansible_facts' from source: unknown
  8370 1726773025.38194: _low_level_execute_command(): starting
  8370 1726773025.38203: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  8370 1726773025.40592: stdout chunk (state=2):
>>>/root
<<<
  8370 1726773025.40711: stderr chunk (state=3):
>>><<<
  8370 1726773025.40719: stdout chunk (state=3):
>>><<<
  8370 1726773025.40738: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  8370 1726773025.40753: _low_level_execute_command(): starting
  8370 1726773025.40761: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773025.4074671-8370-3458756889931 `" && echo ansible-tmp-1726773025.4074671-8370-3458756889931="` echo /root/.ansible/tmp/ansible-tmp-1726773025.4074671-8370-3458756889931 `" ) && sleep 0'
  8370 1726773025.43331: stdout chunk (state=2):
>>>ansible-tmp-1726773025.4074671-8370-3458756889931=/root/.ansible/tmp/ansible-tmp-1726773025.4074671-8370-3458756889931
<<<
  8370 1726773025.43463: stderr chunk (state=3):
>>><<<
  8370 1726773025.43472: stdout chunk (state=3):
>>><<<
  8370 1726773025.43490: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773025.4074671-8370-3458756889931=/root/.ansible/tmp/ansible-tmp-1726773025.4074671-8370-3458756889931
, stderr=
  8370 1726773025.43515: variable 'ansible_module_compression' from source: unknown
  8370 1726773025.43557: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
  8370 1726773025.43609: variable 'ansible_facts' from source: unknown
  8370 1726773025.43768: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773025.4074671-8370-3458756889931/AnsiballZ_setup.py
  8370 1726773025.43877: Sending initial data
  8370 1726773025.43884: Sent initial data (150 bytes)
  8370 1726773025.46499: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmp8khiclfg /root/.ansible/tmp/ansible-tmp-1726773025.4074671-8370-3458756889931/AnsiballZ_setup.py
<<<
  8370 1726773025.48554: stderr chunk (state=3):
>>><<<
  8370 1726773025.48565: stdout chunk (state=3):
>>><<<
  8370 1726773025.48589: done transferring module to remote
  8370 1726773025.48600: _low_level_execute_command(): starting
  8370 1726773025.48605: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773025.4074671-8370-3458756889931/ /root/.ansible/tmp/ansible-tmp-1726773025.4074671-8370-3458756889931/AnsiballZ_setup.py && sleep 0'
  8370 1726773025.51064: stderr chunk (state=2):
>>><<<
  8370 1726773025.51073: stdout chunk (state=2):
>>><<<
  8370 1726773025.51090: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8370 1726773025.51095: _low_level_execute_command(): starting
  8370 1726773025.51101: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773025.4074671-8370-3458756889931/AnsiballZ_setup.py && sleep 0'
  8370 1726773025.79964: stdout chunk (state=2):
>>>
{"ansible_facts": {"ansible_pkg_mgr": "dnf"}, "invocation": {"module_args": {"filter": ["ansible_pkg_mgr"], "gather_subset": ["!all"], "gather_timeout": 10, "fact_path": "/etc/ansible/facts.d"}}}
<<<
  8370 1726773025.81693: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8370 1726773025.81757: stderr chunk (state=3):
>>><<<
  8370 1726773025.81766: stdout chunk (state=3):
>>><<<
  8370 1726773025.81782: _low_level_execute_command() done: rc=0, stdout=
{"ansible_facts": {"ansible_pkg_mgr": "dnf"}, "invocation": {"module_args": {"filter": ["ansible_pkg_mgr"], "gather_subset": ["!all"], "gather_timeout": 10, "fact_path": "/etc/ansible/facts.d"}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8370 1726773025.81816: done with _execute_module (ansible.legacy.setup, {'filter': 'ansible_pkg_mgr', 'gather_subset': '!all', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773025.4074671-8370-3458756889931/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  8370 1726773025.81837: Facts {'ansible_facts': {'ansible_pkg_mgr': 'dnf'}, 'invocation': {'module_args': {'filter': ['ansible_pkg_mgr'], 'gather_subset': ['!all'], 'gather_timeout': 10, 'fact_path': '/etc/ansible/facts.d'}}, '_ansible_parsed': True}
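Before dispatching, the generic package action gathers only the ansible_pkg_mgr fact on the target (the ansible.legacy.setup call above with filter ansible_pkg_mgr and gather_subset "!all") to pick a backend; here it resolves to dnf. The equivalent stand-alone fact-gathering task would be:

    - name: Gather only the package manager fact (what the package action does internally)
      ansible.builtin.setup:
        gather_subset: "!all"
        filter: ansible_pkg_mgr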
  8370 1726773025.81906: variable 'ansible_module_compression' from source: unknown
  8370 1726773025.81945: ANSIBALLZ: Using generic lock for ansible.legacy.dnf
  8370 1726773025.81949: ANSIBALLZ: Acquiring lock
  8370 1726773025.81953: ANSIBALLZ: Lock acquired: 140242352720640
  8370 1726773025.81957: ANSIBALLZ: Creating module
  8370 1726773025.95601: ANSIBALLZ: Writing module into payload
  8370 1726773025.95803: ANSIBALLZ: Writing module
  8370 1726773025.95829: ANSIBALLZ: Renaming module
  8370 1726773025.95836: ANSIBALLZ: Done creating module
  8370 1726773025.95850: variable 'ansible_facts' from source: unknown
  8370 1726773025.95932: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773025.4074671-8370-3458756889931/AnsiballZ_dnf.py
  8370 1726773025.96030: Sending initial data
  8370 1726773025.96037: Sent initial data (148 bytes)
  8370 1726773025.98703: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpxexrzai5 /root/.ansible/tmp/ansible-tmp-1726773025.4074671-8370-3458756889931/AnsiballZ_dnf.py
<<<
  8370 1726773026.00458: stderr chunk (state=3):
>>><<<
  8370 1726773026.00471: stdout chunk (state=3):
>>><<<
  8370 1726773026.00498: done transferring module to remote
  8370 1726773026.00511: _low_level_execute_command(): starting
  8370 1726773026.00516: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773025.4074671-8370-3458756889931/ /root/.ansible/tmp/ansible-tmp-1726773025.4074671-8370-3458756889931/AnsiballZ_dnf.py && sleep 0'
  8370 1726773026.03353: stderr chunk (state=2):
>>><<<
  8370 1726773026.03366: stdout chunk (state=2):
>>><<<
  8370 1726773026.03383: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8370 1726773026.03389: _low_level_execute_command(): starting
  8370 1726773026.03395: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773025.4074671-8370-3458756889931/AnsiballZ_dnf.py && sleep 0'
  8370 1726773030.80621: stdout chunk (state=2):
>>>
{"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}}
<<<
  8370 1726773030.89798: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8370 1726773030.89844: stderr chunk (state=3):
>>><<<
  8370 1726773030.89853: stdout chunk (state=3):
>>><<<
  8370 1726773030.89871: _low_level_execute_command() done: rc=0, stdout=
{"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8370 1726773030.89913: done with _execute_module (ansible.legacy.dnf, {'name': ['tuned', 'python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773025.4074671-8370-3458756889931/', '_ansible_remote_tmp': '~/.ansible/tmp'})
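With the backend resolved to dnf, the action plugin runs ansible.legacy.dnf with the module arguments shown in the invocation above; written as a plain task, that call is equivalent to:

    - name: Ensure required packages are installed (resolved backend call)
      ansible.builtin.dnf:
        name:
          - tuned
          - python3-configobj
        state: present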
  8370 1726773030.89922: _low_level_execute_command(): starting
  8370 1726773030.89927: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773025.4074671-8370-3458756889931/ > /dev/null 2>&1 && sleep 0'
  8370 1726773030.92666: stderr chunk (state=2):
>>><<<
  8370 1726773030.92678: stdout chunk (state=2):
>>><<<
  8370 1726773030.92696: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8370 1726773030.92704: handler run complete
  8370 1726773030.92743: attempt loop complete, returning result
  8370 1726773030.92749: _execute() done
  8370 1726773030.92753: dumping result to json
  8370 1726773030.92759: done dumping result, returning
  8370 1726773030.92767: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed [0affffe7-6841-6cfb-81ae-000000000025]
  8370 1726773030.92772: sending task result for task 0affffe7-6841-6cfb-81ae-000000000025
  8370 1726773030.92812: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000025
  8370 1726773030.92816: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do
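At the role level, this step installs the packages selected by the included vars file; the log also consults __kernel_settings_is_ostree while templating the task, which presumably switches the install mechanism on rpm-ostree systems (not exercised in this run). A minimal sketch of the task as confirmed by the result above, without the ostree handling:

    - name: Ensure required packages are installed
      ansible.builtin.package:
        name: "{{ __kernel_settings_packages }}"
        state: present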
  8303 1726773030.93323: no more pending results, returning what we have
  8303 1726773030.93326: results queue empty
  8303 1726773030.93327: checking for any_errors_fatal
  8303 1726773030.93332: done checking for any_errors_fatal
  8303 1726773030.93333: checking for max_fail_percentage
  8303 1726773030.93334: done checking for max_fail_percentage
  8303 1726773030.93335: checking to see if all hosts have failed and the running result is not ok
  8303 1726773030.93336: done checking to see if all hosts have failed
  8303 1726773030.93336: getting the remaining hosts for this loop
  8303 1726773030.93337: done getting the remaining hosts for this loop
  8303 1726773030.93342: getting the next task for host managed_node3
  8303 1726773030.93349: done getting next task for host managed_node3
  8303 1726773030.93353:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes
  8303 1726773030.93355:  ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773030.93364: getting variables
  8303 1726773030.93365: in VariableManager get_vars()
  8303 1726773030.93397: Calling all_inventory to load vars for managed_node3
  8303 1726773030.93400: Calling groups_inventory to load vars for managed_node3
  8303 1726773030.93402: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773030.93411: Calling all_plugins_play to load vars for managed_node3
  8303 1726773030.93413: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773030.93416: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773030.93463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773030.93503: done with get_vars()
  8303 1726773030.93511: done getting variables
  8303 1726773030.93603: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:24
Thursday 19 September 2024  15:10:30 -0400 (0:00:05.588)       0:00:07.514 **** 
  8303 1726773030.93636: entering _queue_task() for managed_node3/debug
  8303 1726773030.93638: Creating lock for debug
  8303 1726773030.93840: worker is 1 (out of 1 available)
  8303 1726773030.93851: exiting _queue_task() for managed_node3/debug
  8303 1726773030.93863: done queuing things up, now waiting for results queue to drain
  8303 1726773030.93864: waiting for pending results...
  8488 1726773030.94166: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes
  8488 1726773030.94295: in run() - task 0affffe7-6841-6cfb-81ae-000000000027
  8488 1726773030.94311: variable 'ansible_search_path' from source: unknown
  8488 1726773030.94316: variable 'ansible_search_path' from source: unknown
  8488 1726773030.94348: calling self._execute()
  8488 1726773030.94414: variable 'ansible_host' from source: host vars for 'managed_node3'
  8488 1726773030.94424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8488 1726773030.94433: variable 'omit' from source: magic vars
  8488 1726773030.94944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
  8488 1726773030.97199: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
  8488 1726773030.97276: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
  8488 1726773030.97341: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
  8488 1726773030.97377: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
  8488 1726773030.97408: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
  8488 1726773030.97482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8488 1726773030.97513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8488 1726773030.97540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8488 1726773030.97579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8488 1726773030.97595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8488 1726773030.97699: variable '__kernel_settings_is_transactional' from source: set_fact
  8488 1726773030.97718: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False
  8488 1726773030.97722: when evaluation is False, skipping this task
  8488 1726773030.97725: _execute() done
  8488 1726773030.97728: dumping result to json
  8488 1726773030.97731: done dumping result, returning
  8488 1726773030.97737: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes [0affffe7-6841-6cfb-81ae-000000000027]
  8488 1726773030.97743: sending task result for task 0affffe7-6841-6cfb-81ae-000000000027
  8488 1726773030.97771: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000027
  8488 1726773030.97774: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "false_condition": "__kernel_settings_is_transactional | d(false)"
}
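This task and the two that follow (the reboot task and the fail task below) are all gated on the same condition and only run on transactional-update systems; since __kernel_settings_is_transactional was set to false earlier, each is skipped. A minimal sketch of the gating pattern, with illustrative message text; only the when condition is confirmed by the log:

    - name: Notify user that reboot is needed to apply changes
      ansible.builtin.debug:
        # message text is illustrative; only the when condition is shown in the log
        msg: Reboot required to apply changes because this is a transactional-update system
      when: __kernel_settings_is_transactional | d(false)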
  8303 1726773030.98154: no more pending results, returning what we have
  8303 1726773030.98157: results queue empty
  8303 1726773030.98158: checking for any_errors_fatal
  8303 1726773030.98165: done checking for any_errors_fatal
  8303 1726773030.98165: checking for max_fail_percentage
  8303 1726773030.98167: done checking for max_fail_percentage
  8303 1726773030.98168: checking to see if all hosts have failed and the running result is not ok
  8303 1726773030.98168: done checking to see if all hosts have failed
  8303 1726773030.98169: getting the remaining hosts for this loop
  8303 1726773030.98170: done getting the remaining hosts for this loop
  8303 1726773030.98173: getting the next task for host managed_node3
  8303 1726773030.98179: done getting next task for host managed_node3
  8303 1726773030.98183:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems
  8303 1726773030.98187:  ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773030.98200: getting variables
  8303 1726773030.98202: in VariableManager get_vars()
  8303 1726773030.98235: Calling all_inventory to load vars for managed_node3
  8303 1726773030.98238: Calling groups_inventory to load vars for managed_node3
  8303 1726773030.98240: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773030.98249: Calling all_plugins_play to load vars for managed_node3
  8303 1726773030.98252: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773030.98255: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773030.98308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773030.98349: done with get_vars()
  8303 1726773030.98358: done getting variables
  8303 1726773030.98492: Loading ActionModule 'reboot' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Reboot transactional update systems] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:29
Thursday 19 September 2024  15:10:30 -0400 (0:00:00.048)       0:00:07.563 **** 
  8303 1726773030.98525: entering _queue_task() for managed_node3/reboot
  8303 1726773030.98527: Creating lock for reboot
  8303 1726773030.98774: worker is 1 (out of 1 available)
  8303 1726773030.98789: exiting _queue_task() for managed_node3/reboot
  8303 1726773030.98800: done queuing things up, now waiting for results queue to drain
  8303 1726773030.98802: waiting for pending results...
  8489 1726773030.98997: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems
  8489 1726773030.99124: in run() - task 0affffe7-6841-6cfb-81ae-000000000028
  8489 1726773030.99141: variable 'ansible_search_path' from source: unknown
  8489 1726773030.99145: variable 'ansible_search_path' from source: unknown
  8489 1726773030.99177: calling self._execute()
  8489 1726773030.99236: variable 'ansible_host' from source: host vars for 'managed_node3'
  8489 1726773030.99245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8489 1726773030.99254: variable 'omit' from source: magic vars
  8489 1726773030.99680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
  8489 1726773031.01728: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
  8489 1726773031.01778: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
  8489 1726773031.01809: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
  8489 1726773031.01838: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
  8489 1726773031.01859: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
  8489 1726773031.01918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8489 1726773031.01944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8489 1726773031.01963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8489 1726773031.01994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8489 1726773031.02006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8489 1726773031.02089: variable '__kernel_settings_is_transactional' from source: set_fact
  8489 1726773031.02107: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False
  8489 1726773031.02112: when evaluation is False, skipping this task
  8489 1726773031.02115: _execute() done
  8489 1726773031.02119: dumping result to json
  8489 1726773031.02123: done dumping result, returning
  8489 1726773031.02130: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems [0affffe7-6841-6cfb-81ae-000000000028]
  8489 1726773031.02136: sending task result for task 0affffe7-6841-6cfb-81ae-000000000028
  8489 1726773031.02162: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000028
  8489 1726773031.02169: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__kernel_settings_is_transactional | d(false)",
    "skip_reason": "Conditional result was False"
}
  8303 1726773031.02326: no more pending results, returning what we have
  8303 1726773031.02329: results queue empty
  8303 1726773031.02329: checking for any_errors_fatal
  8303 1726773031.02333: done checking for any_errors_fatal
  8303 1726773031.02334: checking for max_fail_percentage
  8303 1726773031.02336: done checking for max_fail_percentage
  8303 1726773031.02337: checking to see if all hosts have failed and the running result is not ok
  8303 1726773031.02337: done checking to see if all hosts have failed
  8303 1726773031.02338: getting the remaining hosts for this loop
  8303 1726773031.02339: done getting the remaining hosts for this loop
  8303 1726773031.02342: getting the next task for host managed_node3
  8303 1726773031.02347: done getting next task for host managed_node3
  8303 1726773031.02350:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set
  8303 1726773031.02352:  ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773031.02363: getting variables
  8303 1726773031.02365: in VariableManager get_vars()
  8303 1726773031.02394: Calling all_inventory to load vars for managed_node3
  8303 1726773031.02397: Calling groups_inventory to load vars for managed_node3
  8303 1726773031.02399: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773031.02406: Calling all_plugins_play to load vars for managed_node3
  8303 1726773031.02407: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773031.02409: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773031.02443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773031.02470: done with get_vars()
  8303 1726773031.02476: done getting variables
  8303 1726773031.02519: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:34
Thursday 19 September 2024  15:10:31 -0400 (0:00:00.040)       0:00:07.604 **** 
  8303 1726773031.02542: entering _queue_task() for managed_node3/fail
  8303 1726773031.02707: worker is 1 (out of 1 available)
  8303 1726773031.02721: exiting _queue_task() for managed_node3/fail
  8303 1726773031.02733: done queuing things up, now waiting for results queue to drain
  8303 1726773031.02734: waiting for pending results...
  8492 1726773031.02855: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set
  8492 1726773031.02955: in run() - task 0affffe7-6841-6cfb-81ae-000000000029
  8492 1726773031.02972: variable 'ansible_search_path' from source: unknown
  8492 1726773031.02977: variable 'ansible_search_path' from source: unknown
  8492 1726773031.03006: calling self._execute()
  8492 1726773031.03057: variable 'ansible_host' from source: host vars for 'managed_node3'
  8492 1726773031.03067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8492 1726773031.03077: variable 'omit' from source: magic vars
  8492 1726773031.03445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
  8492 1726773031.05643: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
  8492 1726773031.05695: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
  8492 1726773031.05724: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
  8492 1726773031.05747: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
  8492 1726773031.05774: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
  8492 1726773031.05842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8492 1726773031.05872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8492 1726773031.05899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8492 1726773031.05928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8492 1726773031.05938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8492 1726773031.06024: variable '__kernel_settings_is_transactional' from source: set_fact
  8492 1726773031.06040: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False
  8492 1726773031.06044: when evaluation is False, skipping this task
  8492 1726773031.06047: _execute() done
  8492 1726773031.06052: dumping result to json
  8492 1726773031.06057: done dumping result, returning
  8492 1726773031.06063: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set [0affffe7-6841-6cfb-81ae-000000000029]
  8492 1726773031.06072: sending task result for task 0affffe7-6841-6cfb-81ae-000000000029
  8492 1726773031.06205: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000029
  8492 1726773031.06209: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__kernel_settings_is_transactional | d(false)",
    "skip_reason": "Conditional result was False"
}
  8303 1726773031.06542: no more pending results, returning what we have
  8303 1726773031.06544: results queue empty
  8303 1726773031.06545: checking for any_errors_fatal
  8303 1726773031.06549: done checking for any_errors_fatal
  8303 1726773031.06550: checking for max_fail_percentage
  8303 1726773031.06551: done checking for max_fail_percentage
  8303 1726773031.06552: checking to see if all hosts have failed and the running result is not ok
  8303 1726773031.06553: done checking to see if all hosts have failed
  8303 1726773031.06553: getting the remaining hosts for this loop
  8303 1726773031.06554: done getting the remaining hosts for this loop
  8303 1726773031.06557: getting the next task for host managed_node3
  8303 1726773031.06564: done getting next task for host managed_node3
  8303 1726773031.06568:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config
  8303 1726773031.06570:  ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773031.06582: getting variables
  8303 1726773031.06584: in VariableManager get_vars()
  8303 1726773031.06618: Calling all_inventory to load vars for managed_node3
  8303 1726773031.06621: Calling groups_inventory to load vars for managed_node3
  8303 1726773031.06623: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773031.06632: Calling all_plugins_play to load vars for managed_node3
  8303 1726773031.06635: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773031.06638: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773031.06688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773031.06719: done with get_vars()
  8303 1726773031.06725: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Read tuned main config] ******
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42
Thursday 19 September 2024  15:10:31 -0400 (0:00:00.042)       0:00:07.646 **** 
  8303 1726773031.06783: entering _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config
  8303 1726773031.06784: Creating lock for fedora.linux_system_roles.kernel_settings_get_config
  8303 1726773031.06952: worker is 1 (out of 1 available)
  8303 1726773031.06965: exiting _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config
  8303 1726773031.06977: done queuing things up, now waiting for results queue to drain
  8303 1726773031.06979: waiting for pending results...
  8495 1726773031.07093: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config
  8495 1726773031.07192: in run() - task 0affffe7-6841-6cfb-81ae-00000000002b
  8495 1726773031.07209: variable 'ansible_search_path' from source: unknown
  8495 1726773031.07213: variable 'ansible_search_path' from source: unknown
  8495 1726773031.07242: calling self._execute()
  8495 1726773031.07292: variable 'ansible_host' from source: host vars for 'managed_node3'
  8495 1726773031.07302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8495 1726773031.07310: variable 'omit' from source: magic vars
  8495 1726773031.07410: variable 'omit' from source: magic vars
  8495 1726773031.07455: variable 'omit' from source: magic vars
  8495 1726773031.07486: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars
  8495 1726773031.07752: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars
  8495 1726773031.07830: variable '__kernel_settings_tuned_dir' from source: role '' all vars
  8495 1726773031.07866: variable 'omit' from source: magic vars
  8495 1726773031.07905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8495 1726773031.07939: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8495 1726773031.07960: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8495 1726773031.07979: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8495 1726773031.08037: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8495 1726773031.08069: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8495 1726773031.08075: variable 'ansible_host' from source: host vars for 'managed_node3'
  8495 1726773031.08079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8495 1726773031.08178: Set connection var ansible_pipelining to False
  8495 1726773031.08191: Set connection var ansible_timeout to 10
  8495 1726773031.08198: Set connection var ansible_module_compression to ZIP_DEFLATED
  8495 1726773031.08204: Set connection var ansible_shell_executable to /bin/sh
  8495 1726773031.08207: Set connection var ansible_connection to ssh
  8495 1726773031.08215: Set connection var ansible_shell_type to sh
  8495 1726773031.08234: variable 'ansible_shell_executable' from source: unknown
  8495 1726773031.08238: variable 'ansible_connection' from source: unknown
  8495 1726773031.08241: variable 'ansible_module_compression' from source: unknown
  8495 1726773031.08244: variable 'ansible_shell_type' from source: unknown
  8495 1726773031.08246: variable 'ansible_shell_executable' from source: unknown
  8495 1726773031.08249: variable 'ansible_host' from source: host vars for 'managed_node3'
  8495 1726773031.08252: variable 'ansible_pipelining' from source: unknown
  8495 1726773031.08255: variable 'ansible_timeout' from source: unknown
  8495 1726773031.08258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8495 1726773031.08426: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
  8495 1726773031.08438: variable 'omit' from source: magic vars
  8495 1726773031.08443: starting attempt loop
  8495 1726773031.08446: running the handler
  8495 1726773031.08458: _low_level_execute_command(): starting
  8495 1726773031.08467: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  8495 1726773031.11208: stdout chunk (state=2):
>>>/root
<<<
  8495 1726773031.11348: stderr chunk (state=3):
>>><<<
  8495 1726773031.11357: stdout chunk (state=3):
>>><<<
  8495 1726773031.11380: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  8495 1726773031.11398: _low_level_execute_command(): starting
  8495 1726773031.11407: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773031.1139154-8495-59394965775304 `" && echo ansible-tmp-1726773031.1139154-8495-59394965775304="` echo /root/.ansible/tmp/ansible-tmp-1726773031.1139154-8495-59394965775304 `" ) && sleep 0'
  8495 1726773031.14616: stdout chunk (state=2):
>>>ansible-tmp-1726773031.1139154-8495-59394965775304=/root/.ansible/tmp/ansible-tmp-1726773031.1139154-8495-59394965775304
<<<
  8495 1726773031.14849: stderr chunk (state=3):
>>><<<
  8495 1726773031.14858: stdout chunk (state=3):
>>><<<
  8495 1726773031.14880: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773031.1139154-8495-59394965775304=/root/.ansible/tmp/ansible-tmp-1726773031.1139154-8495-59394965775304
, stderr=
  8495 1726773031.14921: variable 'ansible_module_compression' from source: unknown
  8495 1726773031.14954: ANSIBALLZ: Using lock for fedora.linux_system_roles.kernel_settings_get_config
  8495 1726773031.14966: ANSIBALLZ: Acquiring lock
  8495 1726773031.14970: ANSIBALLZ: Lock acquired: 140242350837536
  8495 1726773031.14977: ANSIBALLZ: Creating module
  8495 1726773031.27383: ANSIBALLZ: Writing module into payload
  8495 1726773031.27474: ANSIBALLZ: Writing module
  8495 1726773031.27502: ANSIBALLZ: Renaming module
  8495 1726773031.27509: ANSIBALLZ: Done creating module
  8495 1726773031.27529: variable 'ansible_facts' from source: unknown
  8495 1726773031.27622: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773031.1139154-8495-59394965775304/AnsiballZ_kernel_settings_get_config.py
  8495 1726773031.28128: Sending initial data
  8495 1726773031.28135: Sent initial data (172 bytes)
  8495 1726773031.31147: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpvqad1o_k /root/.ansible/tmp/ansible-tmp-1726773031.1139154-8495-59394965775304/AnsiballZ_kernel_settings_get_config.py
<<<
  8495 1726773031.32991: stderr chunk (state=3):
>>><<<
  8495 1726773031.33002: stdout chunk (state=3):
>>><<<
  8495 1726773031.33024: done transferring module to remote
  8495 1726773031.33035: _low_level_execute_command(): starting
  8495 1726773031.33040: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773031.1139154-8495-59394965775304/ /root/.ansible/tmp/ansible-tmp-1726773031.1139154-8495-59394965775304/AnsiballZ_kernel_settings_get_config.py && sleep 0'
  8495 1726773031.35724: stderr chunk (state=2):
>>><<<
  8495 1726773031.35738: stdout chunk (state=2):
>>><<<
  8495 1726773031.35755: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8495 1726773031.35759: _low_level_execute_command(): starting
  8495 1726773031.35767: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773031.1139154-8495-59394965775304/AnsiballZ_kernel_settings_get_config.py && sleep 0'
  8495 1726773031.52094: stdout chunk (state=2):
>>>
{"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}}
<<<
  8495 1726773031.53298: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8495 1726773031.53345: stderr chunk (state=3):
>>><<<
  8495 1726773031.53355: stdout chunk (state=3):
>>><<<
  8495 1726773031.53377: _low_level_execute_command() done: rc=0, stdout=
{"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8495 1726773031.53411: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/tuned-main.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773031.1139154-8495-59394965775304/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  8495 1726773031.53423: _low_level_execute_command(): starting
  8495 1726773031.53429: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773031.1139154-8495-59394965775304/ > /dev/null 2>&1 && sleep 0'
  8495 1726773031.56215: stderr chunk (state=2):
>>><<<
  8495 1726773031.56226: stdout chunk (state=2):
>>><<<
  8495 1726773031.56242: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8495 1726773031.56250: handler run complete
  8495 1726773031.56272: attempt loop complete, returning result
  8495 1726773031.56278: _execute() done
  8495 1726773031.56281: dumping result to json
  8495 1726773031.56288: done dumping result, returning
  8495 1726773031.56297: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config [0affffe7-6841-6cfb-81ae-00000000002b]
  8495 1726773031.56302: sending task result for task 0affffe7-6841-6cfb-81ae-00000000002b
  8495 1726773031.56339: done sending task result for task 0affffe7-6841-6cfb-81ae-00000000002b
  8495 1726773031.56343: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "data": {
        "daemon": "1",
        "default_instance_priority": "0",
        "dynamic_tuning": "0",
        "log_file_count": "2",
        "log_file_max_size": "1MB",
        "reapply_sysctl": "1",
        "recommend_command": "1",
        "sleep_interval": "1",
        "udev_buffer_size": "1MB",
        "update_interval": "10"
    }
}
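Note: the result above comes from the collection-private module fedora.linux_system_roles.kernel_settings_get_config, invoked (per the _execute_module line) with a single path argument pointing at /etc/tuned/tuned-main.conf. A minimal sketch of a task shaped like that invocation, assuming the result is kept in __kernel_settings_register_tuned_main (that variable name is resolved from a fact later in this log; the exact task wording in the role is an assumption):

    # Hedged sketch, not the role's literal task text
    - name: Read tuned main config
      fedora.linux_system_roles.kernel_settings_get_config:
        path: /etc/tuned/tuned-main.conf
      register: __kernel_settings_register_tuned_main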
  8303 1726773031.56799: no more pending results, returning what we have
  8303 1726773031.56802: results queue empty
  8303 1726773031.56803: checking for any_errors_fatal
  8303 1726773031.56807: done checking for any_errors_fatal
  8303 1726773031.56808: checking for max_fail_percentage
  8303 1726773031.56809: done checking for max_fail_percentage
  8303 1726773031.56809: checking to see if all hosts have failed and the running result is not ok
  8303 1726773031.56810: done checking to see if all hosts have failed
  8303 1726773031.56810: getting the remaining hosts for this loop
  8303 1726773031.56811: done getting the remaining hosts for this loop
  8303 1726773031.56815: getting the next task for host managed_node3
  8303 1726773031.56820: done getting next task for host managed_node3
  8303 1726773031.56823:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory
  8303 1726773031.56826:  ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773031.56837: getting variables
  8303 1726773031.56838: in VariableManager get_vars()
  8303 1726773031.56872: Calling all_inventory to load vars for managed_node3
  8303 1726773031.56875: Calling groups_inventory to load vars for managed_node3
  8303 1726773031.56877: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773031.56887: Calling all_plugins_play to load vars for managed_node3
  8303 1726773031.56890: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773031.56893: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773031.56943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773031.56988: done with get_vars()
  8303 1726773031.56995: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50
Thursday 19 September 2024  15:10:31 -0400 (0:00:00.502)       0:00:08.149 **** 
  8303 1726773031.57081: entering _queue_task() for managed_node3/stat
  8303 1726773031.57278: worker is 1 (out of 1 available)
  8303 1726773031.57293: exiting _queue_task() for managed_node3/stat
  8303 1726773031.57304: done queuing things up, now waiting for results queue to drain
  8303 1726773031.57305: waiting for pending results...
  8523 1726773031.57504: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory
  8523 1726773031.57628: in run() - task 0affffe7-6841-6cfb-81ae-00000000002c
  8523 1726773031.57645: variable 'ansible_search_path' from source: unknown
  8523 1726773031.57649: variable 'ansible_search_path' from source: unknown
  8523 1726773031.57696: variable '__prof_from_conf' from source: task vars
  8523 1726773031.57996: variable '__prof_from_conf' from source: task vars
  8523 1726773031.58188: variable '__data' from source: task vars
  8523 1726773031.58259: variable '__kernel_settings_register_tuned_main' from source: set_fact
  8523 1726773031.60105: variable '__kernel_settings_tuned_dir' from source: role '' all vars
  8523 1726773031.60122: variable '__kernel_settings_tuned_dir' from source: role '' all vars
  8523 1726773031.60192: variable '__kernel_settings_tuned_dir' from source: role '' all vars
  8523 1726773031.60278: variable 'omit' from source: magic vars
  8523 1726773031.60382: variable 'ansible_host' from source: host vars for 'managed_node3'
  8523 1726773031.60395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8523 1726773031.60406: variable 'omit' from source: magic vars
  8523 1726773031.60658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
  8523 1726773031.62972: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
  8523 1726773031.63036: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
  8523 1726773031.63076: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
  8523 1726773031.63110: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
  8523 1726773031.63136: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
  8523 1726773031.63210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8523 1726773031.63252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8523 1726773031.63281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8523 1726773031.63325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8523 1726773031.63342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8523 1726773031.63440: variable 'item' from source: unknown
  8523 1726773031.63456: Evaluated conditional (item | length > 0): False
  8523 1726773031.63461: when evaluation is False, skipping this task
  8523 1726773031.63497: variable 'item' from source: unknown
  8523 1726773031.63569: variable 'item' from source: unknown
skipping: [managed_node3] => (item=)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "item | length > 0",
    "item": "",
    "skip_reason": "Conditional result was False"
}
  8523 1726773031.63654: variable 'ansible_host' from source: host vars for 'managed_node3'
  8523 1726773031.63667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8523 1726773031.63677: variable 'omit' from source: magic vars
  8523 1726773031.63822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8523 1726773031.63846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8523 1726773031.64067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8523 1726773031.64109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8523 1726773031.64125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8523 1726773031.64201: variable 'item' from source: unknown
  8523 1726773031.64211: Evaluated conditional (item | length > 0): True
  8523 1726773031.64217: variable 'omit' from source: magic vars
  8523 1726773031.64254: variable 'omit' from source: magic vars
  8523 1726773031.64471: variable 'item' from source: unknown
  8523 1726773031.64534: variable 'item' from source: unknown
  8523 1726773031.64553: variable 'omit' from source: magic vars
  8523 1726773031.64594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8523 1726773031.64621: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8523 1726773031.64639: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8523 1726773031.64657: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8523 1726773031.64670: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8523 1726773031.64700: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8523 1726773031.64706: variable 'ansible_host' from source: host vars for 'managed_node3'
  8523 1726773031.64711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8523 1726773031.64811: Set connection var ansible_pipelining to False
  8523 1726773031.64822: Set connection var ansible_timeout to 10
  8523 1726773031.64829: Set connection var ansible_module_compression to ZIP_DEFLATED
  8523 1726773031.64836: Set connection var ansible_shell_executable to /bin/sh
  8523 1726773031.64839: Set connection var ansible_connection to ssh
  8523 1726773031.64847: Set connection var ansible_shell_type to sh
  8523 1726773031.64869: variable 'ansible_shell_executable' from source: unknown
  8523 1726773031.64874: variable 'ansible_connection' from source: unknown
  8523 1726773031.64878: variable 'ansible_module_compression' from source: unknown
  8523 1726773031.64881: variable 'ansible_shell_type' from source: unknown
  8523 1726773031.64884: variable 'ansible_shell_executable' from source: unknown
  8523 1726773031.64890: variable 'ansible_host' from source: host vars for 'managed_node3'
  8523 1726773031.64894: variable 'ansible_pipelining' from source: unknown
  8523 1726773031.64897: variable 'ansible_timeout' from source: unknown
  8523 1726773031.64901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8523 1726773031.65035: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
  8523 1726773031.65046: variable 'omit' from source: magic vars
  8523 1726773031.65052: starting attempt loop
  8523 1726773031.65055: running the handler
  8523 1726773031.65069: _low_level_execute_command(): starting
  8523 1726773031.65077: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  8523 1726773031.70094: stdout chunk (state=2):
>>>/root
<<<
  8523 1726773031.70108: stderr chunk (state=2):
>>><<<
  8523 1726773031.70123: stdout chunk (state=3):
>>><<<
  8523 1726773031.70141: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  8523 1726773031.70158: _low_level_execute_command(): starting
  8523 1726773031.70165: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773031.7014973-8523-183023649450147 `" && echo ansible-tmp-1726773031.7014973-8523-183023649450147="` echo /root/.ansible/tmp/ansible-tmp-1726773031.7014973-8523-183023649450147 `" ) && sleep 0'
  8523 1726773031.73297: stdout chunk (state=2):
>>>ansible-tmp-1726773031.7014973-8523-183023649450147=/root/.ansible/tmp/ansible-tmp-1726773031.7014973-8523-183023649450147
<<<
  8523 1726773031.73618: stderr chunk (state=3):
>>><<<
  8523 1726773031.73626: stdout chunk (state=3):
>>><<<
  8523 1726773031.73642: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773031.7014973-8523-183023649450147=/root/.ansible/tmp/ansible-tmp-1726773031.7014973-8523-183023649450147
, stderr=
  8523 1726773031.73679: variable 'ansible_module_compression' from source: unknown
  8523 1726773031.73724: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
  8523 1726773031.73754: variable 'ansible_facts' from source: unknown
  8523 1726773031.73830: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773031.7014973-8523-183023649450147/AnsiballZ_stat.py
  8523 1726773031.74290: Sending initial data
  8523 1726773031.74297: Sent initial data (151 bytes)
  8523 1726773031.77371: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpqy9zgew1 /root/.ansible/tmp/ansible-tmp-1726773031.7014973-8523-183023649450147/AnsiballZ_stat.py
<<<
  8523 1726773031.78834: stderr chunk (state=3):
>>><<<
  8523 1726773031.78847: stdout chunk (state=3):
>>><<<
  8523 1726773031.78867: done transferring module to remote
  8523 1726773031.78878: _low_level_execute_command(): starting
  8523 1726773031.78883: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773031.7014973-8523-183023649450147/ /root/.ansible/tmp/ansible-tmp-1726773031.7014973-8523-183023649450147/AnsiballZ_stat.py && sleep 0'
  8523 1726773031.81338: stderr chunk (state=2):
>>><<<
  8523 1726773031.81347: stdout chunk (state=2):
>>><<<
  8523 1726773031.81367: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8523 1726773031.81376: _low_level_execute_command(): starting
  8523 1726773031.81384: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773031.7014973-8523-183023649450147/AnsiballZ_stat.py && sleep 0'
  8523 1726773031.96668: stdout chunk (state=2):
>>>
{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}}
<<<
  8523 1726773031.97736: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8523 1726773031.97787: stderr chunk (state=3):
>>><<<
  8523 1726773031.97794: stdout chunk (state=3):
>>><<<
  8523 1726773031.97809: _low_level_execute_command() done: rc=0, stdout=
{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8523 1726773031.97834: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773031.7014973-8523-183023649450147/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  8523 1726773031.97844: _low_level_execute_command(): starting
  8523 1726773031.97850: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773031.7014973-8523-183023649450147/ > /dev/null 2>&1 && sleep 0'
  8523 1726773032.00324: stderr chunk (state=2):
>>><<<
  8523 1726773032.00334: stdout chunk (state=2):
>>><<<
  8523 1726773032.00353: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8523 1726773032.00360: handler run complete
  8523 1726773032.00377: attempt loop complete, returning result
  8523 1726773032.00395: variable 'item' from source: unknown
  8523 1726773032.00466: variable 'item' from source: unknown
ok: [managed_node3] => (item=/etc/tuned/profiles) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "/etc/tuned/profiles",
    "stat": {
        "exists": false
    }
}
  8523 1726773032.00553: variable 'ansible_host' from source: host vars for 'managed_node3'
  8523 1726773032.00563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8523 1726773032.00576: variable 'omit' from source: magic vars
  8523 1726773032.00697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8523 1726773032.00717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8523 1726773032.00740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8523 1726773032.00770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8523 1726773032.00781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8523 1726773032.00839: variable 'item' from source: unknown
  8523 1726773032.00848: Evaluated conditional (item | length > 0): True
  8523 1726773032.00854: variable 'omit' from source: magic vars
  8523 1726773032.00867: variable 'omit' from source: magic vars
  8523 1726773032.00897: variable 'item' from source: unknown
  8523 1726773032.00941: variable 'item' from source: unknown
  8523 1726773032.00956: variable 'omit' from source: magic vars
  8523 1726773032.00974: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8523 1726773032.00983: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8523 1726773032.00994: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8523 1726773032.01006: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8523 1726773032.01011: variable 'ansible_host' from source: host vars for 'managed_node3'
  8523 1726773032.01015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8523 1726773032.01064: Set connection var ansible_pipelining to False
  8523 1726773032.01075: Set connection var ansible_timeout to 10
  8523 1726773032.01081: Set connection var ansible_module_compression to ZIP_DEFLATED
  8523 1726773032.01087: Set connection var ansible_shell_executable to /bin/sh
  8523 1726773032.01090: Set connection var ansible_connection to ssh
  8523 1726773032.01097: Set connection var ansible_shell_type to sh
  8523 1726773032.01110: variable 'ansible_shell_executable' from source: unknown
  8523 1726773032.01114: variable 'ansible_connection' from source: unknown
  8523 1726773032.01117: variable 'ansible_module_compression' from source: unknown
  8523 1726773032.01120: variable 'ansible_shell_type' from source: unknown
  8523 1726773032.01124: variable 'ansible_shell_executable' from source: unknown
  8523 1726773032.01127: variable 'ansible_host' from source: host vars for 'managed_node3'
  8523 1726773032.01131: variable 'ansible_pipelining' from source: unknown
  8523 1726773032.01134: variable 'ansible_timeout' from source: unknown
  8523 1726773032.01139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8523 1726773032.01210: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  8523 1726773032.01221: variable 'omit' from source: magic vars
  8523 1726773032.01226: starting attempt loop
  8523 1726773032.01230: running the handler
  8523 1726773032.01237: _low_level_execute_command(): starting
  8523 1726773032.01241: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  8523 1726773032.03532: stdout chunk (state=2):
>>>/root
<<<
  8523 1726773032.03663: stderr chunk (state=3):
>>><<<
  8523 1726773032.03672: stdout chunk (state=3):
>>><<<
  8523 1726773032.03690: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  8523 1726773032.03700: _low_level_execute_command(): starting
  8523 1726773032.03705: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773032.0369675-8523-153317084280855 `" && echo ansible-tmp-1726773032.0369675-8523-153317084280855="` echo /root/.ansible/tmp/ansible-tmp-1726773032.0369675-8523-153317084280855 `" ) && sleep 0'
  8523 1726773032.06232: stdout chunk (state=2):
>>>ansible-tmp-1726773032.0369675-8523-153317084280855=/root/.ansible/tmp/ansible-tmp-1726773032.0369675-8523-153317084280855
<<<
  8523 1726773032.06358: stderr chunk (state=3):
>>><<<
  8523 1726773032.06366: stdout chunk (state=3):
>>><<<
  8523 1726773032.06382: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773032.0369675-8523-153317084280855=/root/.ansible/tmp/ansible-tmp-1726773032.0369675-8523-153317084280855
, stderr=
  8523 1726773032.06414: variable 'ansible_module_compression' from source: unknown
  8523 1726773032.06451: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
  8523 1726773032.06470: variable 'ansible_facts' from source: unknown
  8523 1726773032.06524: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773032.0369675-8523-153317084280855/AnsiballZ_stat.py
  8523 1726773032.06614: Sending initial data
  8523 1726773032.06622: Sent initial data (151 bytes)
  8523 1726773032.09693: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpcodijo1v /root/.ansible/tmp/ansible-tmp-1726773032.0369675-8523-153317084280855/AnsiballZ_stat.py
<<<
  8523 1726773032.11097: stderr chunk (state=3):
>>><<<
  8523 1726773032.11108: stdout chunk (state=3):
>>><<<
  8523 1726773032.11132: done transferring module to remote
  8523 1726773032.11145: _low_level_execute_command(): starting
  8523 1726773032.11150: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773032.0369675-8523-153317084280855/ /root/.ansible/tmp/ansible-tmp-1726773032.0369675-8523-153317084280855/AnsiballZ_stat.py && sleep 0'
  8523 1726773032.14372: stderr chunk (state=2):
>>><<<
  8523 1726773032.14384: stdout chunk (state=2):
>>><<<
  8523 1726773032.14404: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8523 1726773032.14409: _low_level_execute_command(): starting
  8523 1726773032.14414: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773032.0369675-8523-153317084280855/AnsiballZ_stat.py && sleep 0'
  8523 1726773032.31247: stdout chunk (state=2):
>>>
{"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 136, "inode": 917919, "dev": 51713, "nlink": 3, "atime": 1726772618.1734717, "mtime": 1716968741.377, "ctime": 1716968741.377, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}}
<<<
  8523 1726773032.32413: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8523 1726773032.32462: stderr chunk (state=3):
>>><<<
  8523 1726773032.32471: stdout chunk (state=3):
>>><<<
  8523 1726773032.32490: _low_level_execute_command() done: rc=0, stdout=
{"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 136, "inode": 917919, "dev": 51713, "nlink": 3, "atime": 1726772618.1734717, "mtime": 1716968741.377, "ctime": 1716968741.377, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8523 1726773032.32525: done with _execute_module (stat, {'path': '/etc/tuned', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773032.0369675-8523-153317084280855/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  8523 1726773032.32533: _low_level_execute_command(): starting
  8523 1726773032.32539: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773032.0369675-8523-153317084280855/ > /dev/null 2>&1 && sleep 0'
  8523 1726773032.35031: stderr chunk (state=2):
>>><<<
  8523 1726773032.35040: stdout chunk (state=2):
>>><<<
  8523 1726773032.35056: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8523 1726773032.35063: handler run complete
  8523 1726773032.35100: attempt loop complete, returning result
  8523 1726773032.35117: variable 'item' from source: unknown
  8523 1726773032.35179: variable 'item' from source: unknown
ok: [managed_node3] => (item=/etc/tuned) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "/etc/tuned",
    "stat": {
        "atime": 1726772618.1734717,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1716968741.377,
        "dev": 51713,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 917919,
        "isblk": false,
        "ischr": false,
        "isdir": true,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/directory",
        "mode": "0755",
        "mtime": 1716968741.377,
        "nlink": 3,
        "path": "/etc/tuned",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 136,
        "uid": 0,
        "version": "1785990601",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
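Note: the loop above evaluates three candidate paths: an empty string (skipped because the conditional "item | length > 0" is false), /etc/tuned/profiles (exists: false), and /etc/tuned (exists: true). Together with the __prof_from_conf task var resolved earlier, this looks like a stat-over-candidates pattern; a hedged sketch under that assumption (only the candidate values, the conditional, and the __kernel_settings_find_profile_dirs name come from the log):

    # Hedged sketch; the first loop item presumably comes from the parsed tuned-main.conf
    - name: Find tuned profile parent directory
      stat:
        path: "{{ item }}"
      loop:
        - "{{ __prof_from_conf }}"   # empty in this run, so the item is skipped
        - /etc/tuned/profiles
        - /etc/tuned
      when: item | length > 0
      register: __kernel_settings_find_profile_dirs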
  8523 1726773032.35225: dumping result to json
  8523 1726773032.35235: done dumping result, returning
  8523 1726773032.35243: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory [0affffe7-6841-6cfb-81ae-00000000002c]
  8523 1726773032.35249: sending task result for task 0affffe7-6841-6cfb-81ae-00000000002c
  8523 1726773032.35292: done sending task result for task 0affffe7-6841-6cfb-81ae-00000000002c
  8523 1726773032.35296: WORKER PROCESS EXITING
  8303 1726773032.35525: no more pending results, returning what we have
  8303 1726773032.35527: results queue empty
  8303 1726773032.35528: checking for any_errors_fatal
  8303 1726773032.35531: done checking for any_errors_fatal
  8303 1726773032.35532: checking for max_fail_percentage
  8303 1726773032.35533: done checking for max_fail_percentage
  8303 1726773032.35533: checking to see if all hosts have failed and the running result is not ok
  8303 1726773032.35534: done checking to see if all hosts have failed
  8303 1726773032.35534: getting the remaining hosts for this loop
  8303 1726773032.35536: done getting the remaining hosts for this loop
  8303 1726773032.35539: getting the next task for host managed_node3
  8303 1726773032.35543: done getting next task for host managed_node3
  8303 1726773032.35546:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir
  8303 1726773032.35548:  ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773032.35558: getting variables
  8303 1726773032.35559: in VariableManager get_vars()
  8303 1726773032.35578: Calling all_inventory to load vars for managed_node3
  8303 1726773032.35580: Calling groups_inventory to load vars for managed_node3
  8303 1726773032.35581: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773032.35589: Calling all_plugins_play to load vars for managed_node3
  8303 1726773032.35591: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773032.35593: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773032.35626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773032.35649: done with get_vars()
  8303 1726773032.35654: done getting variables
  8303 1726773032.35697: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:63
Thursday 19 September 2024  15:10:32 -0400 (0:00:00.786)       0:00:08.935 **** 
  8303 1726773032.35719: entering _queue_task() for managed_node3/set_fact
  8303 1726773032.35877: worker is 1 (out of 1 available)
  8303 1726773032.35891: exiting _queue_task() for managed_node3/set_fact
  8303 1726773032.35903: done queuing things up, now waiting for results queue to drain
  8303 1726773032.35904: waiting for pending results...
  8564 1726773032.36011: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir
  8564 1726773032.36109: in run() - task 0affffe7-6841-6cfb-81ae-00000000002d
  8564 1726773032.36124: variable 'ansible_search_path' from source: unknown
  8564 1726773032.36128: variable 'ansible_search_path' from source: unknown
  8564 1726773032.36155: calling self._execute()
  8564 1726773032.36206: variable 'ansible_host' from source: host vars for 'managed_node3'
  8564 1726773032.36215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8564 1726773032.36224: variable 'omit' from source: magic vars
  8564 1726773032.36301: variable 'omit' from source: magic vars
  8564 1726773032.36337: variable 'omit' from source: magic vars
  8564 1726773032.36655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
  8564 1726773032.38173: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
  8564 1726773032.38223: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
  8564 1726773032.38251: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
  8564 1726773032.38426: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
  8564 1726773032.38447: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
  8564 1726773032.38509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8564 1726773032.38530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8564 1726773032.38549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8564 1726773032.38579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8564 1726773032.38594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8564 1726773032.38628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8564 1726773032.38645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8564 1726773032.38663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8564 1726773032.38695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8564 1726773032.38707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8564 1726773032.38748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8564 1726773032.38768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8564 1726773032.38788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8564 1726773032.38816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8564 1726773032.38828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8564 1726773032.38979: variable '__kernel_settings_find_profile_dirs' from source: set_fact
  8564 1726773032.39049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
  8564 1726773032.39166: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
  8564 1726773032.39196: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
  8564 1726773032.39219: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
  8564 1726773032.39245: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
  8564 1726773032.39278: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
  8564 1726773032.39298: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
  8564 1726773032.39315: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
  8564 1726773032.39334: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
  8564 1726773032.39376: variable 'omit' from source: magic vars
  8564 1726773032.39399: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8564 1726773032.39419: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8564 1726773032.39434: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8564 1726773032.39448: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8564 1726773032.39457: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8564 1726773032.39481: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8564 1726773032.39493: variable 'ansible_host' from source: host vars for 'managed_node3'
  8564 1726773032.39498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8564 1726773032.39568: Set connection var ansible_pipelining to False
  8564 1726773032.39578: Set connection var ansible_timeout to 10
  8564 1726773032.39584: Set connection var ansible_module_compression to ZIP_DEFLATED
  8564 1726773032.39589: Set connection var ansible_shell_executable to /bin/sh
  8564 1726773032.39592: Set connection var ansible_connection to ssh
  8564 1726773032.39599: Set connection var ansible_shell_type to sh
  8564 1726773032.39617: variable 'ansible_shell_executable' from source: unknown
  8564 1726773032.39621: variable 'ansible_connection' from source: unknown
  8564 1726773032.39625: variable 'ansible_module_compression' from source: unknown
  8564 1726773032.39628: variable 'ansible_shell_type' from source: unknown
  8564 1726773032.39631: variable 'ansible_shell_executable' from source: unknown
  8564 1726773032.39635: variable 'ansible_host' from source: host vars for 'managed_node3'
  8564 1726773032.39639: variable 'ansible_pipelining' from source: unknown
  8564 1726773032.39642: variable 'ansible_timeout' from source: unknown
  8564 1726773032.39646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8564 1726773032.39709: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  8564 1726773032.39720: variable 'omit' from source: magic vars
  8564 1726773032.39726: starting attempt loop
  8564 1726773032.39730: running the handler
  8564 1726773032.39738: handler run complete
  8564 1726773032.39746: attempt loop complete, returning result
  8564 1726773032.39749: _execute() done
  8564 1726773032.39752: dumping result to json
  8564 1726773032.39755: done dumping result, returning
  8564 1726773032.39762: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir [0affffe7-6841-6cfb-81ae-00000000002d]
  8564 1726773032.39770: sending task result for task 0affffe7-6841-6cfb-81ae-00000000002d
  8564 1726773032.39793: done sending task result for task 0affffe7-6841-6cfb-81ae-00000000002d
  8564 1726773032.39796: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "__kernel_settings_profile_parent": "/etc/tuned"
    },
    "changed": false
}
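Note: the set_fact above selects /etc/tuned, i.e. the only candidate whose stat result reported exists: true. One way to express that selection over the registered loop results is sketched below; the Jinja expression is an assumption, only the fact name __kernel_settings_profile_parent and its value /etc/tuned are taken from the log:

    # Hedged sketch of the selection logic
    - name: Set tuned profile parent dir
      set_fact:
        __kernel_settings_profile_parent: "{{ __kernel_settings_find_profile_dirs.results | selectattr('stat', 'defined') | selectattr('stat.exists') | map(attribute='item') | first }}"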
  8303 1726773032.39907: no more pending results, returning what we have
  8303 1726773032.39910: results queue empty
  8303 1726773032.39911: checking for any_errors_fatal
  8303 1726773032.39917: done checking for any_errors_fatal
  8303 1726773032.39918: checking for max_fail_percentage
  8303 1726773032.39919: done checking for max_fail_percentage
  8303 1726773032.39920: checking to see if all hosts have failed and the running result is not ok
  8303 1726773032.39920: done checking to see if all hosts have failed
  8303 1726773032.39921: getting the remaining hosts for this loop
  8303 1726773032.39922: done getting the remaining hosts for this loop
  8303 1726773032.39925: getting the next task for host managed_node3
  8303 1726773032.39929: done getting next task for host managed_node3
  8303 1726773032.39932:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started
  8303 1726773032.39934:  ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773032.39944: getting variables
  8303 1726773032.39945: in VariableManager get_vars()
  8303 1726773032.39974: Calling all_inventory to load vars for managed_node3
  8303 1726773032.39977: Calling groups_inventory to load vars for managed_node3
  8303 1726773032.39978: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773032.39988: Calling all_plugins_play to load vars for managed_node3
  8303 1726773032.39990: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773032.39993: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773032.40037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773032.40068: done with get_vars()
  8303 1726773032.40074: done getting variables
  8303 1726773032.40142: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67
Thursday 19 September 2024  15:10:32 -0400 (0:00:00.044)       0:00:08.980 **** 
  8303 1726773032.40167: entering _queue_task() for managed_node3/service
  8303 1726773032.40168: Creating lock for service
  8303 1726773032.40336: worker is 1 (out of 1 available)
  8303 1726773032.40349: exiting _queue_task() for managed_node3/service
  8303 1726773032.40359: done queuing things up, now waiting for results queue to drain
  8303 1726773032.40360: waiting for pending results...
  8565 1726773032.40468: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started
  8565 1726773032.40569: in run() - task 0affffe7-6841-6cfb-81ae-00000000002e
  8565 1726773032.40584: variable 'ansible_search_path' from source: unknown
  8565 1726773032.40590: variable 'ansible_search_path' from source: unknown
  8565 1726773032.40624: variable '__kernel_settings_services' from source: include_vars
  8565 1726773032.40845: variable '__kernel_settings_services' from source: include_vars
  8565 1726773032.40902: variable 'omit' from source: magic vars
  8565 1726773032.40977: variable 'ansible_host' from source: host vars for 'managed_node3'
  8565 1726773032.40987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8565 1726773032.40993: variable 'omit' from source: magic vars
  8565 1726773032.41044: variable 'omit' from source: magic vars
  8565 1726773032.41075: variable 'omit' from source: magic vars
  8565 1726773032.41108: variable 'item' from source: unknown
  8565 1726773032.41168: variable 'item' from source: unknown
  8565 1726773032.41191: variable 'omit' from source: magic vars
  8565 1726773032.41225: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8565 1726773032.41251: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8565 1726773032.41270: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8565 1726773032.41322: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8565 1726773032.41332: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8565 1726773032.41357: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8565 1726773032.41362: variable 'ansible_host' from source: host vars for 'managed_node3'
  8565 1726773032.41370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8565 1726773032.41435: Set connection var ansible_pipelining to False
  8565 1726773032.41446: Set connection var ansible_timeout to 10
  8565 1726773032.41452: Set connection var ansible_module_compression to ZIP_DEFLATED
  8565 1726773032.41455: Set connection var ansible_shell_executable to /bin/sh
  8565 1726773032.41458: Set connection var ansible_connection to ssh
  8565 1726773032.41463: Set connection var ansible_shell_type to sh
  8565 1726773032.41478: variable 'ansible_shell_executable' from source: unknown
  8565 1726773032.41481: variable 'ansible_connection' from source: unknown
  8565 1726773032.41483: variable 'ansible_module_compression' from source: unknown
  8565 1726773032.41486: variable 'ansible_shell_type' from source: unknown
  8565 1726773032.41489: variable 'ansible_shell_executable' from source: unknown
  8565 1726773032.41492: variable 'ansible_host' from source: host vars for 'managed_node3'
  8565 1726773032.41496: variable 'ansible_pipelining' from source: unknown
  8565 1726773032.41499: variable 'ansible_timeout' from source: unknown
  8565 1726773032.41503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8565 1726773032.41599: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  8565 1726773032.41610: variable 'omit' from source: magic vars
  8565 1726773032.41616: starting attempt loop
  8565 1726773032.41619: running the handler
  8565 1726773032.41684: variable 'ansible_facts' from source: unknown
  8565 1726773032.41717: _low_level_execute_command(): starting
  8565 1726773032.41725: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  8565 1726773032.44526: stdout chunk (state=2):
>>>/root
<<<
  8565 1726773032.44632: stderr chunk (state=3):
>>><<<
  8565 1726773032.44640: stdout chunk (state=3):
>>><<<
  8565 1726773032.44662: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  8565 1726773032.44679: _low_level_execute_command(): starting
  8565 1726773032.44693: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773032.4467342-8565-251192093223088 `" && echo ansible-tmp-1726773032.4467342-8565-251192093223088="` echo /root/.ansible/tmp/ansible-tmp-1726773032.4467342-8565-251192093223088 `" ) && sleep 0'
  8565 1726773032.47241: stdout chunk (state=2):
>>>ansible-tmp-1726773032.4467342-8565-251192093223088=/root/.ansible/tmp/ansible-tmp-1726773032.4467342-8565-251192093223088
<<<
  8565 1726773032.49091: stderr chunk (state=3):
>>><<<
  8565 1726773032.49102: stdout chunk (state=3):
>>><<<
  8565 1726773032.49122: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773032.4467342-8565-251192093223088=/root/.ansible/tmp/ansible-tmp-1726773032.4467342-8565-251192093223088
, stderr=
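
The umask/mkdir pipeline above is how Ansible provisions a per-task working directory on the target; its location is controlled by the remote_tmp setting (visible later in this log as '_ansible_remote_tmp': '~/.ansible/tmp'). A hedged sketch of overriding it per host via host vars, with a hypothetical path chosen only for illustration:

    managed_node3:
      # hypothetical override; this run used the default ~/.ansible/tmp
      ansible_remote_tmp: /var/tmp/.ansible/tmp
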
  8565 1726773032.49152: variable 'ansible_module_compression' from source: unknown
  8565 1726773032.49207: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
  8565 1726773032.49268: variable 'ansible_facts' from source: unknown
  8565 1726773032.49481: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773032.4467342-8565-251192093223088/AnsiballZ_setup.py
  8565 1726773032.50207: Sending initial data
  8565 1726773032.50215: Sent initial data (152 bytes)
  8565 1726773032.53993: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmp9dxgxr0r /root/.ansible/tmp/ansible-tmp-1726773032.4467342-8565-251192093223088/AnsiballZ_setup.py
<<<
  8565 1726773032.57062: stderr chunk (state=3):
>>><<<
  8565 1726773032.57076: stdout chunk (state=3):
>>><<<
  8565 1726773032.57104: done transferring module to remote
  8565 1726773032.57118: _low_level_execute_command(): starting
  8565 1726773032.57124: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773032.4467342-8565-251192093223088/ /root/.ansible/tmp/ansible-tmp-1726773032.4467342-8565-251192093223088/AnsiballZ_setup.py && sleep 0'
  8565 1726773032.60246: stderr chunk (state=2):
>>><<<
  8565 1726773032.60258: stdout chunk (state=2):
>>><<<
  8565 1726773032.60274: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8565 1726773032.60279: _low_level_execute_command(): starting
  8565 1726773032.60287: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773032.4467342-8565-251192093223088/AnsiballZ_setup.py && sleep 0'
  8565 1726773032.88656: stdout chunk (state=2):
>>>
{"ansible_facts": {"ansible_service_mgr": "systemd"}, "invocation": {"module_args": {"gather_subset": ["!all"], "filter": ["ansible_service_mgr"], "gather_timeout": 10, "fact_path": "/etc/ansible/facts.d"}}}
<<<
  8565 1726773032.90380: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8565 1726773032.90424: stderr chunk (state=3):
>>><<<
  8565 1726773032.90432: stdout chunk (state=3):
>>><<<
  8565 1726773032.90445: _low_level_execute_command() done: rc=0, stdout=
{"ansible_facts": {"ansible_service_mgr": "systemd"}, "invocation": {"module_args": {"gather_subset": ["!all"], "filter": ["ansible_service_mgr"], "gather_timeout": 10, "fact_path": "/etc/ansible/facts.d"}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8565 1726773032.90474: done with _execute_module (ansible.legacy.setup, {'gather_subset': '!all', 'filter': 'ansible_service_mgr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773032.4467342-8565-251192093223088/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  8565 1726773032.90496: Facts {'ansible_facts': {'ansible_service_mgr': 'systemd'}, 'invocation': {'module_args': {'gather_subset': ['!all'], 'filter': ['ansible_service_mgr'], 'gather_timeout': 10, 'fact_path': '/etc/ansible/facts.d'}}, '_ansible_parsed': True}
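
Before dispatching the real module, the service action plugin runs a filtered fact-gathering pass to learn the service manager (here: systemd). A sketch of the roughly equivalent explicit task, reconstructed from the module_args logged above — this is not the role's code, just the same call expressed as a task:

    - name: Determine the service manager
      ansible.builtin.setup:
        gather_subset:
          - '!all'
        filter:
          - ansible_service_mgr
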
  8565 1726773032.90550: variable 'ansible_module_compression' from source: unknown
  8565 1726773032.90588: ANSIBALLZ: Using generic lock for ansible.legacy.systemd
  8565 1726773032.90593: ANSIBALLZ: Acquiring lock
  8565 1726773032.90596: ANSIBALLZ: Lock acquired: 140242352720640
  8565 1726773032.90600: ANSIBALLZ: Creating module
  8565 1726773033.17711: ANSIBALLZ: Writing module into payload
  8565 1726773033.17927: ANSIBALLZ: Writing module
  8565 1726773033.17959: ANSIBALLZ: Renaming module
  8565 1726773033.17968: ANSIBALLZ: Done creating module
  8565 1726773033.18005: variable 'ansible_facts' from source: unknown
  8565 1726773033.18247: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773032.4467342-8565-251192093223088/AnsiballZ_systemd.py
  8565 1726773033.19392: Sending initial data
  8565 1726773033.19400: Sent initial data (154 bytes)
  8565 1726773033.23106: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpbst5q91z /root/.ansible/tmp/ansible-tmp-1726773032.4467342-8565-251192093223088/AnsiballZ_systemd.py
<<<
  8565 1726773033.27332: stderr chunk (state=3):
>>><<<
  8565 1726773033.27344: stdout chunk (state=3):
>>><<<
  8565 1726773033.27372: done transferring module to remote
  8565 1726773033.27387: _low_level_execute_command(): starting
  8565 1726773033.27394: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773032.4467342-8565-251192093223088/ /root/.ansible/tmp/ansible-tmp-1726773032.4467342-8565-251192093223088/AnsiballZ_systemd.py && sleep 0'
  8565 1726773033.30447: stderr chunk (state=2):
>>><<<
  8565 1726773033.30458: stdout chunk (state=2):
>>><<<
  8565 1726773033.30479: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8565 1726773033.30487: _low_level_execute_command(): starting
  8565 1726773033.30493: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773032.4467342-8565-251192093223088/AnsiballZ_systemd.py && sleep 0'
  8565 1726773033.59854: stdout chunk (state=2):
>>>
{"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:03:38 EDT", "WatchdogTimestampMonotonic": "33506369", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "664", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:03:37 EDT", "ExecMainStartTimestampMonotonic": "32243396", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "664", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:03:37 EDT] ; stop_time=[n/a] ; pid=664 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18628608", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", 
"TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:03:38 EDT", "StateChangeTimestampMonotonic": "33506375", "InactiveExitTimestamp": "Thu 2024-09-19 15:03:37 EDT", "InactiveExitTimestampMonotonic": "32243440", "ActiveEnterTimestamp": "Thu 2024-09-19 15:03:38 EDT", "ActiveEnterTimestampMonotonic": "33506375", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:03:37 EDT", "ConditionTimestampMonotonic": "32242529", "AssertTimestamp": "Thu 2024-09-19 15:03:37 EDT", "AssertTimestampMonotonic": "32242534", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "7582752b17874324b2c9dc01ae0a603c", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}}
<<<
  8565 1726773033.61277: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8565 1726773033.61291: stdout chunk (state=3):
>>><<<
  8565 1726773033.61304: stderr chunk (state=3):
>>><<<
  8565 1726773033.61324: _low_level_execute_command() done: rc=0, stdout=
{"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:03:38 EDT", "WatchdogTimestampMonotonic": "33506369", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "664", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:03:37 EDT", "ExecMainStartTimestampMonotonic": "32243396", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "664", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:03:37 EDT] ; stop_time=[n/a] ; pid=664 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18628608", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", 
"TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:03:38 EDT", "StateChangeTimestampMonotonic": "33506375", "InactiveExitTimestamp": "Thu 2024-09-19 15:03:37 EDT", "InactiveExitTimestampMonotonic": "32243440", "ActiveEnterTimestamp": "Thu 2024-09-19 15:03:38 EDT", "ActiveEnterTimestampMonotonic": "33506375", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:03:37 EDT", "ConditionTimestampMonotonic": "32242529", "AssertTimestamp": "Thu 2024-09-19 15:03:37 EDT", "AssertTimestampMonotonic": "32242534", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "7582752b17874324b2c9dc01ae0a603c", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8565 1726773033.61476: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773032.4467342-8565-251192093223088/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  8565 1726773033.61502: _low_level_execute_command(): starting
  8565 1726773033.61510: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773032.4467342-8565-251192093223088/ > /dev/null 2>&1 && sleep 0'
  8565 1726773033.64221: stderr chunk (state=2):
>>><<<
  8565 1726773033.64231: stdout chunk (state=2):
>>><<<
  8565 1726773033.64247: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8565 1726773033.64254: handler run complete
  8565 1726773033.64289: attempt loop complete, returning result
  8565 1726773033.64307: variable 'item' from source: unknown
  8565 1726773033.64372: variable 'item' from source: unknown
ok: [managed_node3] => (item=tuned) => {
    "ansible_loop_var": "item",
    "changed": false,
    "enabled": true,
    "item": "tuned",
    "name": "tuned",
    "state": "started",
    "status": {
        "ActiveEnterTimestamp": "Thu 2024-09-19 15:03:38 EDT",
        "ActiveEnterTimestampMonotonic": "33506375",
        "ActiveExitTimestampMonotonic": "0",
        "ActiveState": "active",
        "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target",
        "AllowIsolate": "no",
        "AllowedCPUs": "",
        "AllowedMemoryNodes": "",
        "AmbientCapabilities": "",
        "AssertResult": "yes",
        "AssertTimestamp": "Thu 2024-09-19 15:03:37 EDT",
        "AssertTimestampMonotonic": "32242534",
        "Before": "shutdown.target multi-user.target",
        "BlockIOAccounting": "no",
        "BlockIOWeight": "[not set]",
        "BusName": "com.redhat.tuned",
        "CPUAccounting": "no",
        "CPUAffinity": "",
        "CPUAffinityFromNUMA": "no",
        "CPUQuotaPerSecUSec": "infinity",
        "CPUQuotaPeriodUSec": "infinity",
        "CPUSchedulingPolicy": "0",
        "CPUSchedulingPriority": "0",
        "CPUSchedulingResetOnFork": "no",
        "CPUShares": "[not set]",
        "CPUUsageNSec": "[not set]",
        "CPUWeight": "[not set]",
        "CacheDirectoryMode": "0755",
        "CanFreeze": "yes",
        "CanIsolate": "no",
        "CanReload": "no",
        "CanStart": "yes",
        "CanStop": "yes",
        "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf",
        "CollectMode": "inactive",
        "ConditionResult": "yes",
        "ConditionTimestamp": "Thu 2024-09-19 15:03:37 EDT",
        "ConditionTimestampMonotonic": "32242529",
        "ConfigurationDirectoryMode": "0755",
        "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target",
        "ControlGroup": "/system.slice/tuned.service",
        "ControlPID": "0",
        "DefaultDependencies": "yes",
        "DefaultMemoryLow": "0",
        "DefaultMemoryMin": "0",
        "Delegate": "no",
        "Description": "Dynamic System Tuning Daemon",
        "DevicePolicy": "auto",
        "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)",
        "DynamicUser": "no",
        "EffectiveCPUs": "",
        "EffectiveMemoryNodes": "",
        "ExecMainCode": "0",
        "ExecMainExitTimestampMonotonic": "0",
        "ExecMainPID": "664",
        "ExecMainStartTimestamp": "Thu 2024-09-19 15:03:37 EDT",
        "ExecMainStartTimestampMonotonic": "32243396",
        "ExecMainStatus": "0",
        "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:03:37 EDT] ; stop_time=[n/a] ; pid=664 ; code=(null) ; status=0/0 }",
        "FailureAction": "none",
        "FileDescriptorStoreMax": "0",
        "FragmentPath": "/usr/lib/systemd/system/tuned.service",
        "FreezerState": "running",
        "GID": "[not set]",
        "GuessMainPID": "yes",
        "IOAccounting": "no",
        "IOSchedulingClass": "0",
        "IOSchedulingPriority": "0",
        "IOWeight": "[not set]",
        "IPAccounting": "no",
        "IPEgressBytes": "18446744073709551615",
        "IPEgressPackets": "18446744073709551615",
        "IPIngressBytes": "18446744073709551615",
        "IPIngressPackets": "18446744073709551615",
        "Id": "tuned.service",
        "IgnoreOnIsolate": "no",
        "IgnoreSIGPIPE": "yes",
        "InactiveEnterTimestampMonotonic": "0",
        "InactiveExitTimestamp": "Thu 2024-09-19 15:03:37 EDT",
        "InactiveExitTimestampMonotonic": "32243440",
        "InvocationID": "7582752b17874324b2c9dc01ae0a603c",
        "JobRunningTimeoutUSec": "infinity",
        "JobTimeoutAction": "none",
        "JobTimeoutUSec": "infinity",
        "KeyringMode": "private",
        "KillMode": "control-group",
        "KillSignal": "15",
        "LimitAS": "infinity",
        "LimitASSoft": "infinity",
        "LimitCORE": "infinity",
        "LimitCORESoft": "0",
        "LimitCPU": "infinity",
        "LimitCPUSoft": "infinity",
        "LimitDATA": "infinity",
        "LimitDATASoft": "infinity",
        "LimitFSIZE": "infinity",
        "LimitFSIZESoft": "infinity",
        "LimitLOCKS": "infinity",
        "LimitLOCKSSoft": "infinity",
        "LimitMEMLOCK": "65536",
        "LimitMEMLOCKSoft": "65536",
        "LimitMSGQUEUE": "819200",
        "LimitMSGQUEUESoft": "819200",
        "LimitNICE": "0",
        "LimitNICESoft": "0",
        "LimitNOFILE": "262144",
        "LimitNOFILESoft": "1024",
        "LimitNPROC": "14003",
        "LimitNPROCSoft": "14003",
        "LimitRSS": "infinity",
        "LimitRSSSoft": "infinity",
        "LimitRTPRIO": "0",
        "LimitRTPRIOSoft": "0",
        "LimitRTTIME": "infinity",
        "LimitRTTIMESoft": "infinity",
        "LimitSIGPENDING": "14003",
        "LimitSIGPENDINGSoft": "14003",
        "LimitSTACK": "infinity",
        "LimitSTACKSoft": "8388608",
        "LoadState": "loaded",
        "LockPersonality": "no",
        "LogLevelMax": "-1",
        "LogRateLimitBurst": "0",
        "LogRateLimitIntervalUSec": "0",
        "LogsDirectoryMode": "0755",
        "MainPID": "664",
        "MemoryAccounting": "yes",
        "MemoryCurrent": "18628608",
        "MemoryDenyWriteExecute": "no",
        "MemoryHigh": "infinity",
        "MemoryLimit": "infinity",
        "MemoryLow": "0",
        "MemoryMax": "infinity",
        "MemoryMin": "0",
        "MemorySwapMax": "infinity",
        "MountAPIVFS": "no",
        "MountFlags": "",
        "NFileDescriptorStore": "0",
        "NRestarts": "0",
        "NUMAMask": "",
        "NUMAPolicy": "n/a",
        "Names": "tuned.service",
        "NeedDaemonReload": "no",
        "Nice": "0",
        "NoNewPrivileges": "no",
        "NonBlocking": "no",
        "NotifyAccess": "none",
        "OOMScoreAdjust": "0",
        "OnFailureJobMode": "replace",
        "PIDFile": "/run/tuned/tuned.pid",
        "PermissionsStartOnly": "no",
        "Perpetual": "no",
        "PrivateDevices": "no",
        "PrivateMounts": "no",
        "PrivateNetwork": "no",
        "PrivateTmp": "no",
        "PrivateUsers": "no",
        "ProtectControlGroups": "no",
        "ProtectHome": "no",
        "ProtectKernelModules": "no",
        "ProtectKernelTunables": "no",
        "ProtectSystem": "no",
        "RefuseManualStart": "no",
        "RefuseManualStop": "no",
        "RemainAfterExit": "no",
        "RemoveIPC": "no",
        "Requires": "system.slice sysinit.target dbus.service dbus.socket",
        "Restart": "no",
        "RestartUSec": "100ms",
        "RestrictNamespaces": "no",
        "RestrictRealtime": "no",
        "RestrictSUIDSGID": "no",
        "Result": "success",
        "RootDirectoryStartOnly": "no",
        "RuntimeDirectoryMode": "0755",
        "RuntimeDirectoryPreserve": "no",
        "RuntimeMaxUSec": "infinity",
        "SameProcessGroup": "no",
        "SecureBits": "0",
        "SendSIGHUP": "no",
        "SendSIGKILL": "yes",
        "Slice": "system.slice",
        "StandardError": "inherit",
        "StandardInput": "null",
        "StandardInputData": "",
        "StandardOutput": "journal",
        "StartLimitAction": "none",
        "StartLimitBurst": "5",
        "StartLimitIntervalUSec": "10s",
        "StartupBlockIOWeight": "[not set]",
        "StartupCPUShares": "[not set]",
        "StartupCPUWeight": "[not set]",
        "StartupIOWeight": "[not set]",
        "StateChangeTimestamp": "Thu 2024-09-19 15:03:38 EDT",
        "StateChangeTimestampMonotonic": "33506375",
        "StateDirectoryMode": "0755",
        "StatusErrno": "0",
        "StopWhenUnneeded": "no",
        "SubState": "running",
        "SuccessAction": "none",
        "SyslogFacility": "3",
        "SyslogLevel": "6",
        "SyslogLevelPrefix": "yes",
        "SyslogPriority": "30",
        "SystemCallErrorNumber": "0",
        "TTYReset": "no",
        "TTYVHangup": "no",
        "TTYVTDisallocate": "no",
        "TasksAccounting": "yes",
        "TasksCurrent": "4",
        "TasksMax": "22405",
        "TimeoutStartUSec": "1min 30s",
        "TimeoutStopUSec": "1min 30s",
        "TimerSlackNSec": "50000",
        "Transient": "no",
        "Type": "dbus",
        "UID": "[not set]",
        "UMask": "0022",
        "UnitFilePreset": "enabled",
        "UnitFileState": "enabled",
        "UtmpMode": "init",
        "WantedBy": "multi-user.target",
        "WatchdogTimestamp": "Thu 2024-09-19 15:03:38 EDT",
        "WatchdogTimestampMonotonic": "33506369",
        "WatchdogUSec": "0"
    }
}
  8565 1726773033.65463: dumping result to json
  8565 1726773033.65483: done dumping result, returning
  8565 1726773033.65494: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started [0affffe7-6841-6cfb-81ae-00000000002e]
  8565 1726773033.65500: sending task result for task 0affffe7-6841-6cfb-81ae-00000000002e
  8565 1726773033.65609: done sending task result for task 0affffe7-6841-6cfb-81ae-00000000002e
  8565 1726773033.65614: WORKER PROCESS EXITING
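
The result above comes from the "Ensure required services are enabled and started" task: the generic service action detected systemd and delegated to ansible.legacy.systemd with name=tuned, state=started, enabled=true, looping over item=tuned. A sketch of the shape of such a task, reconstructed from the logged module_args and loop variable — not the role's literal source, and the loop list is assumed from this single item:

    - name: Ensure required services are enabled and started
      ansible.builtin.service:
        name: "{{ item }}"
        state: started
        enabled: true
      loop:
        - tuned
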
  8303 1726773033.65959: no more pending results, returning what we have
  8303 1726773033.65962: results queue empty
  8303 1726773033.65962: checking for any_errors_fatal
  8303 1726773033.65966: done checking for any_errors_fatal
  8303 1726773033.65967: checking for max_fail_percentage
  8303 1726773033.65967: done checking for max_fail_percentage
  8303 1726773033.65968: checking to see if all hosts have failed and the running result is not ok
  8303 1726773033.65968: done checking to see if all hosts have failed
  8303 1726773033.65969: getting the remaining hosts for this loop
  8303 1726773033.65969: done getting the remaining hosts for this loop
  8303 1726773033.65971: getting the next task for host managed_node3
  8303 1726773033.65975: done getting next task for host managed_node3
  8303 1726773033.65977:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists
  8303 1726773033.65979:  ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773033.65987: getting variables
  8303 1726773033.65988: in VariableManager get_vars()
  8303 1726773033.66010: Calling all_inventory to load vars for managed_node3
  8303 1726773033.66012: Calling groups_inventory to load vars for managed_node3
  8303 1726773033.66013: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773033.66019: Calling all_plugins_play to load vars for managed_node3
  8303 1726773033.66021: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773033.66023: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773033.66055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773033.66082: done with get_vars()
  8303 1726773033.66092: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74
Thursday 19 September 2024  15:10:33 -0400 (0:00:01.259)       0:00:10.240 **** 
  8303 1726773033.66157: entering _queue_task() for managed_node3/file
  8303 1726773033.66326: worker is 1 (out of 1 available)
  8303 1726773033.66339: exiting _queue_task() for managed_node3/file
  8303 1726773033.66350: done queuing things up, now waiting for results queue to drain
  8303 1726773033.66352: waiting for pending results...
  8635 1726773033.66460: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists
  8635 1726773033.66560: in run() - task 0affffe7-6841-6cfb-81ae-00000000002f
  8635 1726773033.66575: variable 'ansible_search_path' from source: unknown
  8635 1726773033.66579: variable 'ansible_search_path' from source: unknown
  8635 1726773033.66608: calling self._execute()
  8635 1726773033.66654: variable 'ansible_host' from source: host vars for 'managed_node3'
  8635 1726773033.66661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8635 1726773033.66668: variable 'omit' from source: magic vars
  8635 1726773033.66745: variable 'omit' from source: magic vars
  8635 1726773033.66778: variable 'omit' from source: magic vars
  8635 1726773033.66798: variable '__kernel_settings_profile_dir' from source: role '' all vars
  8635 1726773033.67012: variable '__kernel_settings_profile_dir' from source: role '' all vars
  8635 1726773033.67087: variable '__kernel_settings_profile_parent' from source: set_fact
  8635 1726773033.67097: variable '__kernel_settings_tuned_profile' from source: role '' all vars
  8635 1726773033.67129: variable 'omit' from source: magic vars
  8635 1726773033.67163: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8635 1726773033.67192: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8635 1726773033.67210: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8635 1726773033.67223: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8635 1726773033.67234: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8635 1726773033.67257: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8635 1726773033.67263: variable 'ansible_host' from source: host vars for 'managed_node3'
  8635 1726773033.67267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8635 1726773033.67334: Set connection var ansible_pipelining to False
  8635 1726773033.67343: Set connection var ansible_timeout to 10
  8635 1726773033.67347: Set connection var ansible_module_compression to ZIP_DEFLATED
  8635 1726773033.67351: Set connection var ansible_shell_executable to /bin/sh
  8635 1726773033.67354: Set connection var ansible_connection to ssh
  8635 1726773033.67358: Set connection var ansible_shell_type to sh
  8635 1726773033.67371: variable 'ansible_shell_executable' from source: unknown
  8635 1726773033.67374: variable 'ansible_connection' from source: unknown
  8635 1726773033.67376: variable 'ansible_module_compression' from source: unknown
  8635 1726773033.67378: variable 'ansible_shell_type' from source: unknown
  8635 1726773033.67380: variable 'ansible_shell_executable' from source: unknown
  8635 1726773033.67381: variable 'ansible_host' from source: host vars for 'managed_node3'
  8635 1726773033.67384: variable 'ansible_pipelining' from source: unknown
  8635 1726773033.67400: variable 'ansible_timeout' from source: unknown
  8635 1726773033.67405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8635 1726773033.67545: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
  8635 1726773033.67556: variable 'omit' from source: magic vars
  8635 1726773033.67563: starting attempt loop
  8635 1726773033.67570: running the handler
  8635 1726773033.67581: _low_level_execute_command(): starting
  8635 1726773033.67589: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  8635 1726773033.69988: stdout chunk (state=2):
>>>/root
<<<
  8635 1726773033.70360: stderr chunk (state=3):
>>><<<
  8635 1726773033.70371: stdout chunk (state=3):
>>><<<
  8635 1726773033.70395: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  8635 1726773033.70411: _low_level_execute_command(): starting
  8635 1726773033.70418: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773033.7040386-8635-87043009928552 `" && echo ansible-tmp-1726773033.7040386-8635-87043009928552="` echo /root/.ansible/tmp/ansible-tmp-1726773033.7040386-8635-87043009928552 `" ) && sleep 0'
  8635 1726773033.73563: stdout chunk (state=2):
>>>ansible-tmp-1726773033.7040386-8635-87043009928552=/root/.ansible/tmp/ansible-tmp-1726773033.7040386-8635-87043009928552
<<<
  8635 1726773033.73721: stderr chunk (state=3):
>>><<<
  8635 1726773033.73731: stdout chunk (state=3):
>>><<<
  8635 1726773033.73748: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773033.7040386-8635-87043009928552=/root/.ansible/tmp/ansible-tmp-1726773033.7040386-8635-87043009928552
, stderr=
  8635 1726773033.73796: variable 'ansible_module_compression' from source: unknown
  8635 1726773033.73855: ANSIBALLZ: Using lock for file
  8635 1726773033.73861: ANSIBALLZ: Acquiring lock
  8635 1726773033.73864: ANSIBALLZ: Lock acquired: 140242353218400
  8635 1726773033.73868: ANSIBALLZ: Creating module
  8635 1726773033.85549: ANSIBALLZ: Writing module into payload
  8635 1726773033.85703: ANSIBALLZ: Writing module
  8635 1726773033.85724: ANSIBALLZ: Renaming module
  8635 1726773033.85732: ANSIBALLZ: Done creating module
  8635 1726773033.85749: variable 'ansible_facts' from source: unknown
  8635 1726773033.85809: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773033.7040386-8635-87043009928552/AnsiballZ_file.py
  8635 1726773033.85914: Sending initial data
  8635 1726773033.85921: Sent initial data (150 bytes)
  8635 1726773033.88606: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpsr3_y52f /root/.ansible/tmp/ansible-tmp-1726773033.7040386-8635-87043009928552/AnsiballZ_file.py
<<<
  8635 1726773033.89833: stderr chunk (state=3):
>>><<<
  8635 1726773033.89844: stdout chunk (state=3):
>>><<<
  8635 1726773033.89864: done transferring module to remote
  8635 1726773033.89876: _low_level_execute_command(): starting
  8635 1726773033.89881: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773033.7040386-8635-87043009928552/ /root/.ansible/tmp/ansible-tmp-1726773033.7040386-8635-87043009928552/AnsiballZ_file.py && sleep 0'
  8635 1726773033.92354: stderr chunk (state=2):
>>><<<
  8635 1726773033.92363: stdout chunk (state=2):
>>><<<
  8635 1726773033.92378: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8635 1726773033.92383: _low_level_execute_command(): starting
  8635 1726773033.92390: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773033.7040386-8635-87043009928552/AnsiballZ_file.py && sleep 0'
  8635 1726773034.08558: stdout chunk (state=2):
>>>
{"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "absent"}, "after": {"path": "/etc/tuned/kernel_settings", "state": "directory"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 6, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}}
<<<
  8635 1726773034.09710: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8635 1726773034.09757: stderr chunk (state=3):
>>><<<
  8635 1726773034.09769: stdout chunk (state=3):
>>><<<
  8635 1726773034.09789: _low_level_execute_command() done: rc=0, stdout=
{"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "absent"}, "after": {"path": "/etc/tuned/kernel_settings", "state": "directory"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 6, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8635 1726773034.09823: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773033.7040386-8635-87043009928552/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  8635 1726773034.09833: _low_level_execute_command(): starting
  8635 1726773034.09841: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773033.7040386-8635-87043009928552/ > /dev/null 2>&1 && sleep 0'
  8635 1726773034.12507: stderr chunk (state=2):
>>><<<
  8635 1726773034.12518: stdout chunk (state=2):
>>><<<
  8635 1726773034.12533: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8635 1726773034.12539: handler run complete
  8635 1726773034.12558: attempt loop complete, returning result
  8635 1726773034.12562: _execute() done
  8635 1726773034.12565: dumping result to json
  8635 1726773034.12570: done dumping result, returning
  8635 1726773034.12578: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists [0affffe7-6841-6cfb-81ae-00000000002f]
  8635 1726773034.12584: sending task result for task 0affffe7-6841-6cfb-81ae-00000000002f
  8635 1726773034.12617: done sending task result for task 0affffe7-6841-6cfb-81ae-00000000002f
  8635 1726773034.12620: WORKER PROCESS EXITING
changed: [managed_node3] => {
    "changed": true,
    "gid": 0,
    "group": "root",
    "mode": "0755",
    "owner": "root",
    "path": "/etc/tuned/kernel_settings",
    "secontext": "unconfined_u:object_r:tuned_etc_t:s0",
    "size": 6,
    "state": "directory",
    "uid": 0
}
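
For reference, the equivalent shape of the file task that produced this changed result, reconstructed from the module_args logged above (path, state, and mode are taken from the log; everything else is left at defaults):

    - name: Ensure kernel settings profile directory exists
      ansible.builtin.file:
        path: /etc/tuned/kernel_settings
        state: directory
        mode: "0755"
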
  8303 1726773034.12774: no more pending results, returning what we have
  8303 1726773034.12777: results queue empty
  8303 1726773034.12778: checking for any_errors_fatal
  8303 1726773034.12792: done checking for any_errors_fatal
  8303 1726773034.12792: checking for max_fail_percentage
  8303 1726773034.12794: done checking for max_fail_percentage
  8303 1726773034.12794: checking to see if all hosts have failed and the running result is not ok
  8303 1726773034.12795: done checking to see if all hosts have failed
  8303 1726773034.12796: getting the remaining hosts for this loop
  8303 1726773034.12797: done getting the remaining hosts for this loop
  8303 1726773034.12800: getting the next task for host managed_node3
  8303 1726773034.12805: done getting next task for host managed_node3
  8303 1726773034.12808:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile
  8303 1726773034.12810:  ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773034.12818: getting variables
  8303 1726773034.12819: in VariableManager get_vars()
  8303 1726773034.12849: Calling all_inventory to load vars for managed_node3
  8303 1726773034.12852: Calling groups_inventory to load vars for managed_node3
  8303 1726773034.12854: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773034.12862: Calling all_plugins_play to load vars for managed_node3
  8303 1726773034.12867: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773034.12869: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773034.12916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773034.12957: done with get_vars()
  8303 1726773034.12967: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Get active_profile] **********
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80
Thursday 19 September 2024  15:10:34 -0400 (0:00:00.468)       0:00:10.709 **** 
  8303 1726773034.13053: entering _queue_task() for managed_node3/slurp
  8303 1726773034.13258: worker is 1 (out of 1 available)
  8303 1726773034.13274: exiting _queue_task() for managed_node3/slurp
  8303 1726773034.13289: done queuing things up, now waiting for results queue to drain
  8303 1726773034.13290: waiting for pending results...
  8654 1726773034.13490: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile
  8654 1726773034.13615: in run() - task 0affffe7-6841-6cfb-81ae-000000000030
  8654 1726773034.13632: variable 'ansible_search_path' from source: unknown
  8654 1726773034.13636: variable 'ansible_search_path' from source: unknown
  8654 1726773034.13672: calling self._execute()
  8654 1726773034.13736: variable 'ansible_host' from source: host vars for 'managed_node3'
  8654 1726773034.13745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8654 1726773034.13754: variable 'omit' from source: magic vars
  8654 1726773034.13846: variable 'omit' from source: magic vars
  8654 1726773034.13886: variable 'omit' from source: magic vars
  8654 1726773034.13906: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars
  8654 1726773034.14125: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars
  8654 1726773034.14188: variable '__kernel_settings_tuned_dir' from source: role '' all vars
  8654 1726773034.14216: variable 'omit' from source: magic vars
  8654 1726773034.14249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8654 1726773034.14277: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8654 1726773034.14296: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8654 1726773034.14311: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8654 1726773034.14323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8654 1726773034.14346: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8654 1726773034.14351: variable 'ansible_host' from source: host vars for 'managed_node3'
  8654 1726773034.14354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8654 1726773034.14431: Set connection var ansible_pipelining to False
  8654 1726773034.14441: Set connection var ansible_timeout to 10
  8654 1726773034.14447: Set connection var ansible_module_compression to ZIP_DEFLATED
  8654 1726773034.14453: Set connection var ansible_shell_executable to /bin/sh
  8654 1726773034.14456: Set connection var ansible_connection to ssh
  8654 1726773034.14464: Set connection var ansible_shell_type to sh
  8654 1726773034.14481: variable 'ansible_shell_executable' from source: unknown
  8654 1726773034.14486: variable 'ansible_connection' from source: unknown
  8654 1726773034.14490: variable 'ansible_module_compression' from source: unknown
  8654 1726773034.14493: variable 'ansible_shell_type' from source: unknown
  8654 1726773034.14497: variable 'ansible_shell_executable' from source: unknown
  8654 1726773034.14500: variable 'ansible_host' from source: host vars for 'managed_node3'
  8654 1726773034.14504: variable 'ansible_pipelining' from source: unknown
  8654 1726773034.14507: variable 'ansible_timeout' from source: unknown
  8654 1726773034.14512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8654 1726773034.14649: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
  8654 1726773034.14660: variable 'omit' from source: magic vars
  8654 1726773034.14668: starting attempt loop
  8654 1726773034.14671: running the handler
  8654 1726773034.14682: _low_level_execute_command(): starting
  8654 1726773034.14691: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  8654 1726773034.17041: stdout chunk (state=2):
>>>/root
<<<
  8654 1726773034.17161: stderr chunk (state=3):
>>><<<
  8654 1726773034.17171: stdout chunk (state=3):
>>><<<
  8654 1726773034.17191: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  8654 1726773034.17205: _low_level_execute_command(): starting
  8654 1726773034.17210: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773034.171991-8654-10902533376869 `" && echo ansible-tmp-1726773034.171991-8654-10902533376869="` echo /root/.ansible/tmp/ansible-tmp-1726773034.171991-8654-10902533376869 `" ) && sleep 0'
  8654 1726773034.19730: stdout chunk (state=2):
>>>ansible-tmp-1726773034.171991-8654-10902533376869=/root/.ansible/tmp/ansible-tmp-1726773034.171991-8654-10902533376869
<<<
  8654 1726773034.19867: stderr chunk (state=3):
>>><<<
  8654 1726773034.19875: stdout chunk (state=3):
>>><<<
  8654 1726773034.19892: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773034.171991-8654-10902533376869=/root/.ansible/tmp/ansible-tmp-1726773034.171991-8654-10902533376869
, stderr=
  8654 1726773034.19931: variable 'ansible_module_compression' from source: unknown
  8654 1726773034.19972: ANSIBALLZ: Using lock for slurp
  8654 1726773034.19977: ANSIBALLZ: Acquiring lock
  8654 1726773034.19980: ANSIBALLZ: Lock acquired: 140242352721456
  8654 1726773034.19984: ANSIBALLZ: Creating module
  8654 1726773034.29762: ANSIBALLZ: Writing module into payload
  8654 1726773034.29818: ANSIBALLZ: Writing module
  8654 1726773034.29840: ANSIBALLZ: Renaming module
  8654 1726773034.29847: ANSIBALLZ: Done creating module
  8654 1726773034.29863: variable 'ansible_facts' from source: unknown
  8654 1726773034.29922: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773034.171991-8654-10902533376869/AnsiballZ_slurp.py
  8654 1726773034.30026: Sending initial data
  8654 1726773034.30034: Sent initial data (150 bytes)
  8654 1726773034.32707: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmp5ptv9lr4 /root/.ansible/tmp/ansible-tmp-1726773034.171991-8654-10902533376869/AnsiballZ_slurp.py
<<<
  8654 1726773034.36393: stderr chunk (state=3):
>>><<<
  8654 1726773034.36404: stdout chunk (state=3):
>>><<<
  8654 1726773034.36427: done transferring module to remote
  8654 1726773034.36439: _low_level_execute_command(): starting
  8654 1726773034.36445: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773034.171991-8654-10902533376869/ /root/.ansible/tmp/ansible-tmp-1726773034.171991-8654-10902533376869/AnsiballZ_slurp.py && sleep 0'
  8654 1726773034.39230: stderr chunk (state=2):
>>><<<
  8654 1726773034.39239: stdout chunk (state=2):
>>><<<
  8654 1726773034.39255: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8654 1726773034.39259: _low_level_execute_command(): starting
  8654 1726773034.39265: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773034.171991-8654-10902533376869/AnsiballZ_slurp.py && sleep 0'
  8654 1726773034.54149: stdout chunk (state=2):
>>>
{"content": "dmlydHVhbC1ndWVzdAo=", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}}
<<<
  8654 1726773034.55220: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8654 1726773034.55268: stderr chunk (state=3):
>>><<<
  8654 1726773034.55276: stdout chunk (state=3):
>>><<<
  8654 1726773034.55294: _low_level_execute_command() done: rc=0, stdout=
{"content": "dmlydHVhbC1ndWVzdAo=", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8654 1726773034.55319: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773034.171991-8654-10902533376869/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  8654 1726773034.55329: _low_level_execute_command(): starting
  8654 1726773034.55335: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773034.171991-8654-10902533376869/ > /dev/null 2>&1 && sleep 0'
  8654 1726773034.57829: stderr chunk (state=2):
>>><<<
  8654 1726773034.57841: stdout chunk (state=2):
>>><<<
  8654 1726773034.57856: _low_level_execute_command() done: rc=0, stdout=, stderr=
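The sequence above (create a temp dir, sftp the AnsiballZ_slurp.py payload, chmod it, run it with /usr/libexec/platform-python, then rm the temp dir) is the full cost of one module invocation with pipelining off; the connection vars in this run set ansible_pipelining to False. Enabling pipelining sends the module source over the already-open SSH channel and skips the staging steps for most modules (file transfers such as copy's source still go over sftp). A hedged sketch, assuming sudo on the managed nodes does not enforce requiretty:

    # group_vars/all.yml (hypothetical)
    ansible_pipelining: true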
  8654 1726773034.57862: handler run complete
  8654 1726773034.57876: attempt loop complete, returning result
  8654 1726773034.57880: _execute() done
  8654 1726773034.57883: dumping result to json
  8654 1726773034.57889: done dumping result, returning
  8654 1726773034.57896: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile [0affffe7-6841-6cfb-81ae-000000000030]
  8654 1726773034.57902: sending task result for task 0affffe7-6841-6cfb-81ae-000000000030
  8654 1726773034.57931: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000030
  8654 1726773034.57936: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "content": "dmlydHVhbC1ndWVzdAo=",
    "encoding": "base64",
    "source": "/etc/tuned/active_profile"
}
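The slurp result returns the file contents base64-encoded: "dmlydHVhbC1ndWVzdAo=" decodes to "virtual-guest" plus a trailing newline, i.e. the tuned profile that was active before the role ran. A minimal sketch of this step and of decoding the value; the register name matches the __cur_profile variable seen later in the log, but the exact task text in tasks/main.yml is not reproduced here:

    - name: Get active_profile
      slurp:
        path: /etc/tuned/active_profile
      register: __cur_profile

    - name: Show the decoded profile name (illustrative only)
      debug:
        msg: "{{ __cur_profile.content | b64decode | trim }}"   # -> virtual-guest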
  8303 1726773034.58062: no more pending results, returning what we have
  8303 1726773034.58068: results queue empty
  8303 1726773034.58068: checking for any_errors_fatal
  8303 1726773034.58075: done checking for any_errors_fatal
  8303 1726773034.58076: checking for max_fail_percentage
  8303 1726773034.58077: done checking for max_fail_percentage
  8303 1726773034.58077: checking to see if all hosts have failed and the running result is not ok
  8303 1726773034.58078: done checking to see if all hosts have failed
  8303 1726773034.58078: getting the remaining hosts for this loop
  8303 1726773034.58080: done getting the remaining hosts for this loop
  8303 1726773034.58082: getting the next task for host managed_node3
  8303 1726773034.58088: done getting next task for host managed_node3
  8303 1726773034.58091:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile
  8303 1726773034.58094:  ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773034.58103: getting variables
  8303 1726773034.58104: in VariableManager get_vars()
  8303 1726773034.58132: Calling all_inventory to load vars for managed_node3
  8303 1726773034.58135: Calling groups_inventory to load vars for managed_node3
  8303 1726773034.58137: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773034.58145: Calling all_plugins_play to load vars for managed_node3
  8303 1726773034.58147: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773034.58150: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773034.58200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773034.58238: done with get_vars()
  8303 1726773034.58246: done getting variables
  8303 1726773034.58291: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set active_profile] **********
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:85
Thursday 19 September 2024  15:10:34 -0400 (0:00:00.452)       0:00:11.161 **** 
  8303 1726773034.58314: entering _queue_task() for managed_node3/set_fact
  8303 1726773034.58478: worker is 1 (out of 1 available)
  8303 1726773034.58493: exiting _queue_task() for managed_node3/set_fact
  8303 1726773034.58504: done queuing things up, now waiting for results queue to drain
  8303 1726773034.58505: waiting for pending results...
  8677 1726773034.58609: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile
  8677 1726773034.58710: in run() - task 0affffe7-6841-6cfb-81ae-000000000031
  8677 1726773034.58727: variable 'ansible_search_path' from source: unknown
  8677 1726773034.58732: variable 'ansible_search_path' from source: unknown
  8677 1726773034.58759: calling self._execute()
  8677 1726773034.58809: variable 'ansible_host' from source: host vars for 'managed_node3'
  8677 1726773034.58819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8677 1726773034.58828: variable 'omit' from source: magic vars
  8677 1726773034.58900: variable 'omit' from source: magic vars
  8677 1726773034.58934: variable 'omit' from source: magic vars
  8677 1726773034.59213: variable '__kernel_settings_tuned_profile' from source: role '' all vars
  8677 1726773034.59223: variable '__cur_profile' from source: task vars
  8677 1726773034.59327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
  8677 1726773034.60950: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
  8677 1726773034.61001: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
  8677 1726773034.61029: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
  8677 1726773034.61057: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
  8677 1726773034.61080: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
  8677 1726773034.61137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8677 1726773034.61156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8677 1726773034.61175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8677 1726773034.61204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8677 1726773034.61213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8677 1726773034.61287: variable '__kernel_settings_tuned_current_profile' from source: set_fact
  8677 1726773034.61322: variable 'omit' from source: magic vars
  8677 1726773034.61342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8677 1726773034.61360: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8677 1726773034.61376: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8677 1726773034.61389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8677 1726773034.61397: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8677 1726773034.61419: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8677 1726773034.61423: variable 'ansible_host' from source: host vars for 'managed_node3'
  8677 1726773034.61425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8677 1726773034.61488: Set connection var ansible_pipelining to False
  8677 1726773034.61496: Set connection var ansible_timeout to 10
  8677 1726773034.61500: Set connection var ansible_module_compression to ZIP_DEFLATED
  8677 1726773034.61503: Set connection var ansible_shell_executable to /bin/sh
  8677 1726773034.61506: Set connection var ansible_connection to ssh
  8677 1726773034.61511: Set connection var ansible_shell_type to sh
  8677 1726773034.61525: variable 'ansible_shell_executable' from source: unknown
  8677 1726773034.61527: variable 'ansible_connection' from source: unknown
  8677 1726773034.61529: variable 'ansible_module_compression' from source: unknown
  8677 1726773034.61531: variable 'ansible_shell_type' from source: unknown
  8677 1726773034.61532: variable 'ansible_shell_executable' from source: unknown
  8677 1726773034.61534: variable 'ansible_host' from source: host vars for 'managed_node3'
  8677 1726773034.61537: variable 'ansible_pipelining' from source: unknown
  8677 1726773034.61539: variable 'ansible_timeout' from source: unknown
  8677 1726773034.61541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8677 1726773034.61598: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  8677 1726773034.61607: variable 'omit' from source: magic vars
  8677 1726773034.61611: starting attempt loop
  8677 1726773034.61614: running the handler
  8677 1726773034.61620: handler run complete
  8677 1726773034.61626: attempt loop complete, returning result
  8677 1726773034.61629: _execute() done
  8677 1726773034.61631: dumping result to json
  8677 1726773034.61633: done dumping result, returning
  8677 1726773034.61637: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile [0affffe7-6841-6cfb-81ae-000000000031]
  8677 1726773034.61640: sending task result for task 0affffe7-6841-6cfb-81ae-000000000031
  8677 1726773034.61656: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000031
  8677 1726773034.61658: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "__kernel_settings_active_profile": "virtual-guest kernel_settings"
    },
    "changed": false
}
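Set active_profile (tasks/main.yml:85) combines the slurped profile with the role's own tuned profile name, yielding the fact "virtual-guest kernel_settings". The role's actual Jinja expression is not visible in this log; the following is a rough approximation that produces the same value under the assumption that kernel_settings is not already part of the active profile:

    - name: Set active_profile   # approximation; the real task likely guards against duplicating kernel_settings
      set_fact:
        __kernel_settings_active_profile: >-
          {{ __cur_profile.content | b64decode | trim }} {{ __kernel_settings_tuned_profile }}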
  8303 1726773034.61877: no more pending results, returning what we have
  8303 1726773034.61880: results queue empty
  8303 1726773034.61880: checking for any_errors_fatal
  8303 1726773034.61883: done checking for any_errors_fatal
  8303 1726773034.61884: checking for max_fail_percentage
  8303 1726773034.61887: done checking for max_fail_percentage
  8303 1726773034.61887: checking to see if all hosts have failed and the running result is not ok
  8303 1726773034.61888: done checking to see if all hosts have failed
  8303 1726773034.61888: getting the remaining hosts for this loop
  8303 1726773034.61889: done getting the remaining hosts for this loop
  8303 1726773034.61891: getting the next task for host managed_node3
  8303 1726773034.61895: done getting next task for host managed_node3
  8303 1726773034.61897:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile
  8303 1726773034.61899:  ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773034.61909: getting variables
  8303 1726773034.61910: in VariableManager get_vars()
  8303 1726773034.61934: Calling all_inventory to load vars for managed_node3
  8303 1726773034.61937: Calling groups_inventory to load vars for managed_node3
  8303 1726773034.61938: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773034.61944: Calling all_plugins_play to load vars for managed_node3
  8303 1726773034.61946: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773034.61948: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773034.61981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773034.62009: done with get_vars()
  8303 1726773034.62015: done getting variables
  8303 1726773034.62099: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91
Thursday 19 September 2024  15:10:34 -0400 (0:00:00.038)       0:00:11.199 **** 
  8303 1726773034.62121: entering _queue_task() for managed_node3/copy
  8303 1726773034.62272: worker is 1 (out of 1 available)
  8303 1726773034.62288: exiting _queue_task() for managed_node3/copy
  8303 1726773034.62298: done queuing things up, now waiting for results queue to drain
  8303 1726773034.62300: waiting for pending results...
  8678 1726773034.62405: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile
  8678 1726773034.62504: in run() - task 0affffe7-6841-6cfb-81ae-000000000032
  8678 1726773034.62521: variable 'ansible_search_path' from source: unknown
  8678 1726773034.62526: variable 'ansible_search_path' from source: unknown
  8678 1726773034.62553: calling self._execute()
  8678 1726773034.62601: variable 'ansible_host' from source: host vars for 'managed_node3'
  8678 1726773034.62611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8678 1726773034.62619: variable 'omit' from source: magic vars
  8678 1726773034.62693: variable 'omit' from source: magic vars
  8678 1726773034.62725: variable 'omit' from source: magic vars
  8678 1726773034.62748: variable '__kernel_settings_active_profile' from source: set_fact
  8678 1726773034.62953: variable '__kernel_settings_active_profile' from source: set_fact
  8678 1726773034.62977: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars
  8678 1726773034.63027: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars
  8678 1726773034.63086: variable '__kernel_settings_tuned_dir' from source: role '' all vars
  8678 1726773034.63109: variable 'omit' from source: magic vars
  8678 1726773034.63140: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8678 1726773034.63169: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8678 1726773034.63188: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8678 1726773034.63248: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8678 1726773034.63260: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8678 1726773034.63287: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8678 1726773034.63293: variable 'ansible_host' from source: host vars for 'managed_node3'
  8678 1726773034.63297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8678 1726773034.63358: Set connection var ansible_pipelining to False
  8678 1726773034.63371: Set connection var ansible_timeout to 10
  8678 1726773034.63378: Set connection var ansible_module_compression to ZIP_DEFLATED
  8678 1726773034.63384: Set connection var ansible_shell_executable to /bin/sh
  8678 1726773034.63389: Set connection var ansible_connection to ssh
  8678 1726773034.63393: Set connection var ansible_shell_type to sh
  8678 1726773034.63407: variable 'ansible_shell_executable' from source: unknown
  8678 1726773034.63410: variable 'ansible_connection' from source: unknown
  8678 1726773034.63412: variable 'ansible_module_compression' from source: unknown
  8678 1726773034.63414: variable 'ansible_shell_type' from source: unknown
  8678 1726773034.63415: variable 'ansible_shell_executable' from source: unknown
  8678 1726773034.63417: variable 'ansible_host' from source: host vars for 'managed_node3'
  8678 1726773034.63419: variable 'ansible_pipelining' from source: unknown
  8678 1726773034.63420: variable 'ansible_timeout' from source: unknown
  8678 1726773034.63422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8678 1726773034.63520: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  8678 1726773034.63530: variable 'omit' from source: magic vars
  8678 1726773034.63537: starting attempt loop
  8678 1726773034.63541: running the handler
  8678 1726773034.63549: _low_level_execute_command(): starting
  8678 1726773034.63556: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  8678 1726773034.65948: stdout chunk (state=2):
>>>/root
<<<
  8678 1726773034.66069: stderr chunk (state=3):
>>><<<
  8678 1726773034.66076: stdout chunk (state=3):
>>><<<
  8678 1726773034.66094: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  8678 1726773034.66107: _low_level_execute_command(): starting
  8678 1726773034.66113: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758 `" && echo ansible-tmp-1726773034.6610146-8678-201693289049758="` echo /root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758 `" ) && sleep 0'
  8678 1726773034.68691: stdout chunk (state=2):
>>>ansible-tmp-1726773034.6610146-8678-201693289049758=/root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758
<<<
  8678 1726773034.68820: stderr chunk (state=3):
>>><<<
  8678 1726773034.68829: stdout chunk (state=3):
>>><<<
  8678 1726773034.68845: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773034.6610146-8678-201693289049758=/root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758
, stderr=
  8678 1726773034.68918: variable 'ansible_module_compression' from source: unknown
  8678 1726773034.68964: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
  8678 1726773034.68996: variable 'ansible_facts' from source: unknown
  8678 1726773034.69065: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758/AnsiballZ_stat.py
  8678 1726773034.69155: Sending initial data
  8678 1726773034.69162: Sent initial data (151 bytes)
  8678 1726773034.71765: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmp31_mcyhh /root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758/AnsiballZ_stat.py
<<<
  8678 1726773034.72951: stderr chunk (state=3):
>>><<<
  8678 1726773034.72962: stdout chunk (state=3):
>>><<<
  8678 1726773034.72981: done transferring module to remote
  8678 1726773034.72993: _low_level_execute_command(): starting
  8678 1726773034.72997: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758/ /root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758/AnsiballZ_stat.py && sleep 0'
  8678 1726773034.75450: stderr chunk (state=2):
>>><<<
  8678 1726773034.75459: stdout chunk (state=2):
>>><<<
  8678 1726773034.75475: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8678 1726773034.75480: _low_level_execute_command(): starting
  8678 1726773034.75487: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758/AnsiballZ_stat.py && sleep 0'
  8678 1726773034.91426: stdout chunk (state=2):
>>>
{"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 14, "inode": 918059, "dev": 51713, "nlink": 1, "atime": 1726773034.53999, "mtime": 1726772618.4714715, "ctime": 1726772618.4714715, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "mimetype": "text/plain", "charset": "us-ascii", "version": "3521309722", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}}
<<<
  8678 1726773034.92601: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8678 1726773034.92651: stderr chunk (state=3):
>>><<<
  8678 1726773034.92658: stdout chunk (state=3):
>>><<<
  8678 1726773034.92676: _low_level_execute_command() done: rc=0, stdout=
{"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 14, "inode": 918059, "dev": 51713, "nlink": 1, "atime": 1726773034.53999, "mtime": 1726772618.4714715, "ctime": 1726772618.4714715, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "mimetype": "text/plain", "charset": "us-ascii", "version": "3521309722", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8678 1726773034.92725: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  8678 1726773034.92817: Sending initial data
  8678 1726773034.92825: Sent initial data (140 bytes)
  8678 1726773034.95469: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpfd3mm9j5 /root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758/source
<<<
  8678 1726773034.95894: stderr chunk (state=3):
>>><<<
  8678 1726773034.95901: stdout chunk (state=3):
>>><<<
  8678 1726773034.95922: _low_level_execute_command(): starting
  8678 1726773034.95928: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758/ /root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758/source && sleep 0'
  8678 1726773034.98339: stderr chunk (state=2):
>>><<<
  8678 1726773034.98348: stdout chunk (state=2):
>>><<<
  8678 1726773034.98362: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8678 1726773034.98383: variable 'ansible_module_compression' from source: unknown
  8678 1726773034.98421: ANSIBALLZ: Using generic lock for ansible.legacy.copy
  8678 1726773034.98426: ANSIBALLZ: Acquiring lock
  8678 1726773034.98431: ANSIBALLZ: Lock acquired: 140242352720640
  8678 1726773034.98435: ANSIBALLZ: Creating module
  8678 1726773035.08437: ANSIBALLZ: Writing module into payload
  8678 1726773035.08575: ANSIBALLZ: Writing module
  8678 1726773035.08598: ANSIBALLZ: Renaming module
  8678 1726773035.08605: ANSIBALLZ: Done creating module
  8678 1726773035.08618: variable 'ansible_facts' from source: unknown
  8678 1726773035.08674: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758/AnsiballZ_copy.py
  8678 1726773035.08773: Sending initial data
  8678 1726773035.08779: Sent initial data (151 bytes)
  8678 1726773035.11583: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmprwn51kxu /root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758/AnsiballZ_copy.py
<<<
  8678 1726773035.13077: stderr chunk (state=3):
>>><<<
  8678 1726773035.13090: stdout chunk (state=3):
>>><<<
  8678 1726773035.13112: done transferring module to remote
  8678 1726773035.13121: _low_level_execute_command(): starting
  8678 1726773035.13127: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758/ /root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758/AnsiballZ_copy.py && sleep 0'
  8678 1726773035.16361: stderr chunk (state=2):
>>><<<
  8678 1726773035.16373: stdout chunk (state=2):
>>><<<
  8678 1726773035.16394: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8678 1726773035.16400: _low_level_execute_command(): starting
  8678 1726773035.16406: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758/AnsiballZ_copy.py && sleep 0'
  8678 1726773035.33464: stdout chunk (state=2):
>>>
{"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758/source", "md5sum": "8d80fe3f09ba4b9ac8d7fd5e8541a324", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758/source", "_original_basename": "tmpfd3mm9j5", "follow": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}}
<<<
  8678 1726773035.34698: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8678 1726773035.34748: stderr chunk (state=3):
>>><<<
  8678 1726773035.34757: stdout chunk (state=3):
>>><<<
  8678 1726773035.34780: _low_level_execute_command() done: rc=0, stdout=
{"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758/source", "md5sum": "8d80fe3f09ba4b9ac8d7fd5e8541a324", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758/source", "_original_basename": "tmpfd3mm9j5", "follow": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8678 1726773035.34820: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/active_profile', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758/source', '_original_basename': 'tmpfd3mm9j5', 'follow': False, 'checksum': 'a79569d3860cb6a066e0e92c8b22ffd0e8796bfd', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  8678 1726773035.34834: _low_level_execute_command(): starting
  8678 1726773035.34841: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758/ > /dev/null 2>&1 && sleep 0'
  8678 1726773035.37557: stderr chunk (state=2):
>>><<<
  8678 1726773035.37569: stdout chunk (state=2):
>>><<<
  8678 1726773035.37583: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8678 1726773035.37590: handler run complete
  8678 1726773035.37607: attempt loop complete, returning result
  8678 1726773035.37610: _execute() done
  8678 1726773035.37612: dumping result to json
  8678 1726773035.37615: done dumping result, returning
  8678 1726773035.37621: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile [0affffe7-6841-6cfb-81ae-000000000032]
  8678 1726773035.37625: sending task result for task 0affffe7-6841-6cfb-81ae-000000000032
  8678 1726773035.37656: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000032
  8678 1726773035.37659: WORKER PROCESS EXITING
changed: [managed_node3] => {
    "changed": true,
    "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd",
    "dest": "/etc/tuned/active_profile",
    "gid": 0,
    "group": "root",
    "md5sum": "8d80fe3f09ba4b9ac8d7fd5e8541a324",
    "mode": "0600",
    "owner": "root",
    "secontext": "system_u:object_r:tuned_rw_etc_t:s0",
    "size": 30,
    "src": "/root/.ansible/tmp/ansible-tmp-1726773034.6610146-8678-201693289049758/source",
    "state": "file",
    "uid": 0
}
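Before writing, the copy action ran ansible.legacy.stat against the destination: the existing /etc/tuned/active_profile was 14 bytes ("virtual-guest" plus newline, checksum 633f07e1...), while the new content checksums to a79569d3..., so the file was replaced. The changed result above reports 30 bytes at mode 0600, consistent with "virtual-guest kernel_settings" plus a newline. A reconstruction of the task at tasks/main.yml:91 from this result; variable names are taken from the log and the real task text may differ:

    - name: Ensure kernel_settings is in active_profile
      copy:
        content: "{{ __kernel_settings_active_profile }}\n"
        dest: "{{ __kernel_settings_tuned_dir }}/active_profile"   # /etc/tuned/active_profile in this run
        mode: "0600"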
  8303 1726773035.37860: no more pending results, returning what we have
  8303 1726773035.37863: results queue empty
  8303 1726773035.37864: checking for any_errors_fatal
  8303 1726773035.37868: done checking for any_errors_fatal
  8303 1726773035.37869: checking for max_fail_percentage
  8303 1726773035.37870: done checking for max_fail_percentage
  8303 1726773035.37871: checking to see if all hosts have failed and the running result is not ok
  8303 1726773035.37871: done checking to see if all hosts have failed
  8303 1726773035.37872: getting the remaining hosts for this loop
  8303 1726773035.37873: done getting the remaining hosts for this loop
  8303 1726773035.37876: getting the next task for host managed_node3
  8303 1726773035.37881: done getting next task for host managed_node3
  8303 1726773035.37884:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual
  8303 1726773035.37887:  ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773035.37896: getting variables
  8303 1726773035.37897: in VariableManager get_vars()
  8303 1726773035.37926: Calling all_inventory to load vars for managed_node3
  8303 1726773035.37929: Calling groups_inventory to load vars for managed_node3
  8303 1726773035.37931: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773035.37939: Calling all_plugins_play to load vars for managed_node3
  8303 1726773035.37942: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773035.37944: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773035.37990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773035.38018: done with get_vars()
  8303 1726773035.38024: done getting variables
  8303 1726773035.38064: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set profile_mode to manual] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99
Thursday 19 September 2024  15:10:35 -0400 (0:00:00.759)       0:00:11.959 **** 
  8303 1726773035.38090: entering _queue_task() for managed_node3/copy
  8303 1726773035.38253: worker is 1 (out of 1 available)
  8303 1726773035.38266: exiting _queue_task() for managed_node3/copy
  8303 1726773035.38277: done queuing things up, now waiting for results queue to drain
  8303 1726773035.38279: waiting for pending results...
  8708 1726773035.38393: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual
  8708 1726773035.38492: in run() - task 0affffe7-6841-6cfb-81ae-000000000033
  8708 1726773035.38508: variable 'ansible_search_path' from source: unknown
  8708 1726773035.38511: variable 'ansible_search_path' from source: unknown
  8708 1726773035.38542: calling self._execute()
  8708 1726773035.38594: variable 'ansible_host' from source: host vars for 'managed_node3'
  8708 1726773035.38602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8708 1726773035.38611: variable 'omit' from source: magic vars
  8708 1726773035.38683: variable 'omit' from source: magic vars
  8708 1726773035.38717: variable 'omit' from source: magic vars
  8708 1726773035.38738: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars
  8708 1726773035.39015: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars
  8708 1726773035.39077: variable '__kernel_settings_tuned_dir' from source: role '' all vars
  8708 1726773035.39106: variable 'omit' from source: magic vars
  8708 1726773035.39147: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8708 1726773035.39181: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8708 1726773035.39201: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8708 1726773035.39215: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8708 1726773035.39226: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8708 1726773035.39253: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8708 1726773035.39259: variable 'ansible_host' from source: host vars for 'managed_node3'
  8708 1726773035.39263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8708 1726773035.39360: Set connection var ansible_pipelining to False
  8708 1726773035.39375: Set connection var ansible_timeout to 10
  8708 1726773035.39383: Set connection var ansible_module_compression to ZIP_DEFLATED
  8708 1726773035.39394: Set connection var ansible_shell_executable to /bin/sh
  8708 1726773035.39398: Set connection var ansible_connection to ssh
  8708 1726773035.39403: Set connection var ansible_shell_type to sh
  8708 1726773035.39416: variable 'ansible_shell_executable' from source: unknown
  8708 1726773035.39419: variable 'ansible_connection' from source: unknown
  8708 1726773035.39420: variable 'ansible_module_compression' from source: unknown
  8708 1726773035.39422: variable 'ansible_shell_type' from source: unknown
  8708 1726773035.39423: variable 'ansible_shell_executable' from source: unknown
  8708 1726773035.39425: variable 'ansible_host' from source: host vars for 'managed_node3'
  8708 1726773035.39427: variable 'ansible_pipelining' from source: unknown
  8708 1726773035.39428: variable 'ansible_timeout' from source: unknown
  8708 1726773035.39430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8708 1726773035.39541: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  8708 1726773035.39552: variable 'omit' from source: magic vars
  8708 1726773035.39557: starting attempt loop
  8708 1726773035.39560: running the handler
  8708 1726773035.39572: _low_level_execute_command(): starting
  8708 1726773035.39579: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  8708 1726773035.42329: stdout chunk (state=2):
>>>/root
<<<
  8708 1726773035.42456: stderr chunk (state=3):
>>><<<
  8708 1726773035.42466: stdout chunk (state=3):
>>><<<
  8708 1726773035.42492: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  8708 1726773035.42507: _low_level_execute_command(): starting
  8708 1726773035.42513: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455 `" && echo ansible-tmp-1726773035.4250095-8708-19701636341455="` echo /root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455 `" ) && sleep 0'
  8708 1726773035.45006: stdout chunk (state=2):
>>>ansible-tmp-1726773035.4250095-8708-19701636341455=/root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455
<<<
  8708 1726773035.45132: stderr chunk (state=3):
>>><<<
  8708 1726773035.45141: stdout chunk (state=3):
>>><<<
  8708 1726773035.45157: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773035.4250095-8708-19701636341455=/root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455
, stderr=
  8708 1726773035.45234: variable 'ansible_module_compression' from source: unknown
  8708 1726773035.45278: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
  8708 1726773035.45308: variable 'ansible_facts' from source: unknown
  8708 1726773035.45378: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455/AnsiballZ_stat.py
  8708 1726773035.45474: Sending initial data
  8708 1726773035.45482: Sent initial data (150 bytes)
  8708 1726773035.48061: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpbbz0vsuf /root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455/AnsiballZ_stat.py
<<<
  8708 1726773035.49266: stderr chunk (state=3):
>>><<<
  8708 1726773035.49277: stdout chunk (state=3):
>>><<<
  8708 1726773035.49302: done transferring module to remote
  8708 1726773035.49315: _low_level_execute_command(): starting
  8708 1726773035.49321: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455/ /root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455/AnsiballZ_stat.py && sleep 0'
  8708 1726773035.51753: stderr chunk (state=2):
>>><<<
  8708 1726773035.51767: stdout chunk (state=2):
>>><<<
  8708 1726773035.51784: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8708 1726773035.51792: _low_level_execute_command(): starting
  8708 1726773035.51798: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455/AnsiballZ_stat.py && sleep 0'
  8708 1726773035.67940: stdout chunk (state=2):
>>>
{"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 5, "inode": 917923, "dev": 51713, "nlink": 1, "atime": 1726772617.7804716, "mtime": 1726772618.4714715, "ctime": 1726772618.4714715, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "mimetype": "text/plain", "charset": "us-ascii", "version": "3852760320", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}}
<<<
  8708 1726773035.69152: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8708 1726773035.69212: stderr chunk (state=3):
>>><<<
  8708 1726773035.69221: stdout chunk (state=3):
>>><<<
  8708 1726773035.69237: _low_level_execute_command() done: rc=0, stdout=
{"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 5, "inode": 917923, "dev": 51713, "nlink": 1, "atime": 1726772617.7804716, "mtime": 1726772618.4714715, "ctime": 1726772618.4714715, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "mimetype": "text/plain", "charset": "us-ascii", "version": "3852760320", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8708 1726773035.69290: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  8708 1726773035.69384: Sending initial data
  8708 1726773035.69394: Sent initial data (139 bytes)
  8708 1726773035.72075: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpdkexsdx5 /root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455/source
<<<
  8708 1726773035.72557: stderr chunk (state=3):
>>><<<
  8708 1726773035.72571: stdout chunk (state=3):
>>><<<
  8708 1726773035.72595: _low_level_execute_command(): starting
  8708 1726773035.72602: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455/ /root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455/source && sleep 0'
  8708 1726773035.75157: stderr chunk (state=2):
>>><<<
  8708 1726773035.75173: stdout chunk (state=2):
>>><<<
  8708 1726773035.75196: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8708 1726773035.75222: variable 'ansible_module_compression' from source: unknown
  8708 1726773035.75277: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED
  8708 1726773035.75299: variable 'ansible_facts' from source: unknown
  8708 1726773035.75374: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455/AnsiballZ_copy.py
  8708 1726773035.75873: Sending initial data
  8708 1726773035.75880: Sent initial data (150 bytes)
  8708 1726773035.78854: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmp81jlicmq /root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455/AnsiballZ_copy.py
<<<
  8708 1726773035.82071: stderr chunk (state=3):
>>><<<
  8708 1726773035.82083: stdout chunk (state=3):
>>><<<
  8708 1726773035.82109: done transferring module to remote
  8708 1726773035.82124: _low_level_execute_command(): starting
  8708 1726773035.82130: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455/ /root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455/AnsiballZ_copy.py && sleep 0'
  8708 1726773035.84834: stderr chunk (state=2):
>>><<<
  8708 1726773035.84845: stdout chunk (state=2):
>>><<<
  8708 1726773035.84863: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8708 1726773035.84874: _low_level_execute_command(): starting
  8708 1726773035.84881: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455/AnsiballZ_copy.py && sleep 0'
  8708 1726773036.01883: stdout chunk (state=2):
>>>
{"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455/source", "md5sum": "cf3f2a865fbea819dadd439586eaee31", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455/source", "_original_basename": "tmpdkexsdx5", "follow": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}}
<<<
  8708 1726773036.03090: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8708 1726773036.03102: stdout chunk (state=3):
>>><<<
  8708 1726773036.03114: stderr chunk (state=3):
>>><<<
  8708 1726773036.03129: _low_level_execute_command() done: rc=0, stdout=
{"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455/source", "md5sum": "cf3f2a865fbea819dadd439586eaee31", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455/source", "_original_basename": "tmpdkexsdx5", "follow": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8708 1726773036.03165: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/profile_mode', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455/source', '_original_basename': 'tmpdkexsdx5', 'follow': False, 'checksum': '3ef9f23deed2e23d3ef2b88b842fb882313e15ce', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  8708 1726773036.03179: _low_level_execute_command(): starting
  8708 1726773036.03187: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455/ > /dev/null 2>&1 && sleep 0'
  8708 1726773036.06360: stderr chunk (state=2):
>>><<<
  8708 1726773036.06374: stdout chunk (state=2):
>>><<<
  8708 1726773036.06396: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8708 1726773036.06405: handler run complete
  8708 1726773036.06431: attempt loop complete, returning result
  8708 1726773036.06436: _execute() done
  8708 1726773036.06440: dumping result to json
  8708 1726773036.06445: done dumping result, returning
  8708 1726773036.06454: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual [0affffe7-6841-6cfb-81ae-000000000033]
  8708 1726773036.06460: sending task result for task 0affffe7-6841-6cfb-81ae-000000000033
  8708 1726773036.06504: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000033
  8708 1726773036.06509: WORKER PROCESS EXITING
changed: [managed_node3] => {
    "changed": true,
    "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce",
    "dest": "/etc/tuned/profile_mode",
    "gid": 0,
    "group": "root",
    "md5sum": "cf3f2a865fbea819dadd439586eaee31",
    "mode": "0600",
    "owner": "root",
    "secontext": "system_u:object_r:tuned_etc_t:s0",
    "size": 7,
    "src": "/root/.ansible/tmp/ansible-tmp-1726773035.4250095-8708-19701636341455/source",
    "state": "file",
    "uid": 0
}
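The same stat-then-copy pattern writes /etc/tuned/profile_mode: the 7-byte result is consistent with "manual" plus a newline, supplied by __kernel_settings_tuned_profile_mode. A reconstruction of the task at tasks/main.yml:99, hedged the same way as above:

    - name: Set profile_mode to manual
      copy:
        content: "{{ __kernel_settings_tuned_profile_mode }}\n"   # "manual" in this run
        dest: "{{ __kernel_settings_tuned_dir }}/profile_mode"    # /etc/tuned/profile_mode in this run
        mode: "0600"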
  8303 1726773036.07179: no more pending results, returning what we have
  8303 1726773036.07182: results queue empty
  8303 1726773036.07183: checking for any_errors_fatal
  8303 1726773036.07195: done checking for any_errors_fatal
  8303 1726773036.07195: checking for max_fail_percentage
  8303 1726773036.07197: done checking for max_fail_percentage
  8303 1726773036.07198: checking to see if all hosts have failed and the running result is not ok
  8303 1726773036.07198: done checking to see if all hosts have failed
  8303 1726773036.07199: getting the remaining hosts for this loop
  8303 1726773036.07200: done getting the remaining hosts for this loop
  8303 1726773036.07207: getting the next task for host managed_node3
  8303 1726773036.07212: done getting next task for host managed_node3
  8303 1726773036.07215:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config
  8303 1726773036.07218:  ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773036.07228: getting variables
  8303 1726773036.07229: in VariableManager get_vars()
  8303 1726773036.07263: Calling all_inventory to load vars for managed_node3
  8303 1726773036.07265: Calling groups_inventory to load vars for managed_node3
  8303 1726773036.07267: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773036.07277: Calling all_plugins_play to load vars for managed_node3
  8303 1726773036.07280: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773036.07282: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773036.07335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773036.07377: done with get_vars()
  8303 1726773036.07388: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Get current config] **********
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107
Thursday 19 September 2024  15:10:36 -0400 (0:00:00.693)       0:00:12.653 **** 
  8303 1726773036.07468: entering _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config
  8303 1726773036.07739: worker is 1 (out of 1 available)
  8303 1726773036.07753: exiting _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config
  8303 1726773036.07763: done queuing things up, now waiting for results queue to drain
  8303 1726773036.07764: waiting for pending results...
  8740 1726773036.08379: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get current config
  8740 1726773036.08501: in run() - task 0affffe7-6841-6cfb-81ae-000000000034
  8740 1726773036.08518: variable 'ansible_search_path' from source: unknown
  8740 1726773036.08521: variable 'ansible_search_path' from source: unknown
  8740 1726773036.08553: calling self._execute()
  8740 1726773036.08613: variable 'ansible_host' from source: host vars for 'managed_node3'
  8740 1726773036.08622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8740 1726773036.08629: variable 'omit' from source: magic vars
  8740 1726773036.08721: variable 'omit' from source: magic vars
  8740 1726773036.08764: variable 'omit' from source: magic vars
  8740 1726773036.08789: variable '__kernel_settings_profile_filename' from source: role '' all vars
  8740 1726773036.09060: variable '__kernel_settings_profile_filename' from source: role '' all vars
  8740 1726773036.09139: variable '__kernel_settings_profile_dir' from source: role '' all vars
  8740 1726773036.09222: variable '__kernel_settings_profile_parent' from source: set_fact
  8740 1726773036.09231: variable '__kernel_settings_tuned_profile' from source: role '' all vars
  8740 1726773036.09273: variable 'omit' from source: magic vars
  8740 1726773036.09371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8740 1726773036.09414: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8740 1726773036.09435: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8740 1726773036.09451: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8740 1726773036.09463: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8740 1726773036.09493: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8740 1726773036.09498: variable 'ansible_host' from source: host vars for 'managed_node3'
  8740 1726773036.09502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8740 1726773036.09602: Set connection var ansible_pipelining to False
  8740 1726773036.09613: Set connection var ansible_timeout to 10
  8740 1726773036.09619: Set connection var ansible_module_compression to ZIP_DEFLATED
  8740 1726773036.09625: Set connection var ansible_shell_executable to /bin/sh
  8740 1726773036.09628: Set connection var ansible_connection to ssh
  8740 1726773036.09635: Set connection var ansible_shell_type to sh
  8740 1726773036.09655: variable 'ansible_shell_executable' from source: unknown
  8740 1726773036.09660: variable 'ansible_connection' from source: unknown
  8740 1726773036.09663: variable 'ansible_module_compression' from source: unknown
  8740 1726773036.09666: variable 'ansible_shell_type' from source: unknown
  8740 1726773036.09668: variable 'ansible_shell_executable' from source: unknown
  8740 1726773036.09671: variable 'ansible_host' from source: host vars for 'managed_node3'
  8740 1726773036.09675: variable 'ansible_pipelining' from source: unknown
  8740 1726773036.09678: variable 'ansible_timeout' from source: unknown
  8740 1726773036.09681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8740 1726773036.09845: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
  8740 1726773036.09855: variable 'omit' from source: magic vars
  8740 1726773036.09861: starting attempt loop
  8740 1726773036.09863: running the handler
  8740 1726773036.09874: _low_level_execute_command(): starting
  8740 1726773036.09881: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  8740 1726773036.13092: stdout chunk (state=2):
>>>/root
<<<
  8740 1726773036.13104: stderr chunk (state=2):
>>><<<
  8740 1726773036.13116: stdout chunk (state=3):
>>><<<
  8740 1726773036.13130: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  8740 1726773036.13144: _low_level_execute_command(): starting
  8740 1726773036.13150: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773036.1313884-8740-212359277273857 `" && echo ansible-tmp-1726773036.1313884-8740-212359277273857="` echo /root/.ansible/tmp/ansible-tmp-1726773036.1313884-8740-212359277273857 `" ) && sleep 0'
  8740 1726773036.16561: stdout chunk (state=2):
>>>ansible-tmp-1726773036.1313884-8740-212359277273857=/root/.ansible/tmp/ansible-tmp-1726773036.1313884-8740-212359277273857
<<<
  8740 1726773036.16633: stderr chunk (state=3):
>>><<<
  8740 1726773036.16641: stdout chunk (state=3):
>>><<<
  8740 1726773036.16660: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773036.1313884-8740-212359277273857=/root/.ansible/tmp/ansible-tmp-1726773036.1313884-8740-212359277273857
, stderr=
  8740 1726773036.16700: variable 'ansible_module_compression' from source: unknown
  8740 1726773036.16732: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED
  8740 1726773036.16769: variable 'ansible_facts' from source: unknown
  8740 1726773036.16842: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773036.1313884-8740-212359277273857/AnsiballZ_kernel_settings_get_config.py
  8740 1726773036.16942: Sending initial data
  8740 1726773036.16949: Sent initial data (173 bytes)
  8740 1726773036.19802: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpn39ngxqa /root/.ansible/tmp/ansible-tmp-1726773036.1313884-8740-212359277273857/AnsiballZ_kernel_settings_get_config.py
<<<
  8740 1726773036.23247: stderr chunk (state=3):
>>><<<
  8740 1726773036.23258: stdout chunk (state=3):
>>><<<
  8740 1726773036.23287: done transferring module to remote
  8740 1726773036.23302: _low_level_execute_command(): starting
  8740 1726773036.23307: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773036.1313884-8740-212359277273857/ /root/.ansible/tmp/ansible-tmp-1726773036.1313884-8740-212359277273857/AnsiballZ_kernel_settings_get_config.py && sleep 0'
  8740 1726773036.26560: stderr chunk (state=2):
>>><<<
  8740 1726773036.26575: stdout chunk (state=2):
>>><<<
  8740 1726773036.26594: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8740 1726773036.26602: _low_level_execute_command(): starting
  8740 1726773036.26610: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773036.1313884-8740-212359277273857/AnsiballZ_kernel_settings_get_config.py && sleep 0'
  8740 1726773036.44327: stdout chunk (state=2):
>>>
{"changed": false, "data": {}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}}
<<<
  8740 1726773036.45473: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8740 1726773036.45524: stderr chunk (state=3):
>>><<<
  8740 1726773036.45532: stdout chunk (state=3):
>>><<<
  8740 1726773036.45548: _low_level_execute_command() done: rc=0, stdout=
{"changed": false, "data": {}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8740 1726773036.45569: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773036.1313884-8740-212359277273857/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  8740 1726773036.45580: _low_level_execute_command(): starting
  8740 1726773036.45587: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773036.1313884-8740-212359277273857/ > /dev/null 2>&1 && sleep 0'
  8740 1726773036.48086: stderr chunk (state=2):
>>><<<
  8740 1726773036.48097: stdout chunk (state=2):
>>><<<
  8740 1726773036.48114: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8740 1726773036.48122: handler run complete
  8740 1726773036.48136: attempt loop complete, returning result
  8740 1726773036.48139: _execute() done
  8740 1726773036.48143: dumping result to json
  8740 1726773036.48146: done dumping result, returning
  8740 1726773036.48154: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get current config [0affffe7-6841-6cfb-81ae-000000000034]
  8740 1726773036.48160: sending task result for task 0affffe7-6841-6cfb-81ae-000000000034
  8740 1726773036.48193: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000034
  8740 1726773036.48197: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "data": {}
}
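The "Get current config" task returned ok with an empty data dict because /etc/tuned/kernel_settings/tuned.conf does not exist yet on the managed node. The sketch below is illustrative only, not the kernel_settings_get_config module source; it just mirrors the behaviour observed in the log: parse the tuned profile as INI, and return an empty dict when the file is absent.

    # Illustrative stand-in for the observed behaviour of
    # fedora.linux_system_roles.kernel_settings_get_config.
    import configparser
    import os

    def get_tuned_config(path: str = "/etc/tuned/kernel_settings/tuned.conf") -> dict:
        if not os.path.exists(path):
            return {}
        parser = configparser.ConfigParser()
        parser.read(path)
        return {section: dict(parser[section]) for section in parser.sections()}

    if __name__ == "__main__":
        # On a host without the profile this prints {}, matching the task result above.
        print(get_tuned_config())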
  8303 1726773036.48320: no more pending results, returning what we have
  8303 1726773036.48323: results queue empty
  8303 1726773036.48324: checking for any_errors_fatal
  8303 1726773036.48329: done checking for any_errors_fatal
  8303 1726773036.48330: checking for max_fail_percentage
  8303 1726773036.48331: done checking for max_fail_percentage
  8303 1726773036.48332: checking to see if all hosts have failed and the running result is not ok
  8303 1726773036.48332: done checking to see if all hosts have failed
  8303 1726773036.48333: getting the remaining hosts for this loop
  8303 1726773036.48334: done getting the remaining hosts for this loop
  8303 1726773036.48338: getting the next task for host managed_node3
  8303 1726773036.48343: done getting next task for host managed_node3
  8303 1726773036.48346:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings
  8303 1726773036.48348:  ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773036.48357: getting variables
  8303 1726773036.48358: in VariableManager get_vars()
  8303 1726773036.48390: Calling all_inventory to load vars for managed_node3
  8303 1726773036.48392: Calling groups_inventory to load vars for managed_node3
  8303 1726773036.48394: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773036.48403: Calling all_plugins_play to load vars for managed_node3
  8303 1726773036.48405: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773036.48408: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773036.48453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773036.48483: done with get_vars()
  8303 1726773036.48492: done getting variables
  8303 1726773036.48572: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Apply kernel settings] *******
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112
Thursday 19 September 2024  15:10:36 -0400 (0:00:00.411)       0:00:13.064 **** 
  8303 1726773036.48597: entering _queue_task() for managed_node3/template
  8303 1726773036.48599: Creating lock for template
  8303 1726773036.48772: worker is 1 (out of 1 available)
  8303 1726773036.48789: exiting _queue_task() for managed_node3/template
  8303 1726773036.48803: done queuing things up, now waiting for results queue to drain
  8303 1726773036.48804: waiting for pending results...
  8771 1726773036.48918: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings
  8771 1726773036.49023: in run() - task 0affffe7-6841-6cfb-81ae-000000000035
  8771 1726773036.49041: variable 'ansible_search_path' from source: unknown
  8771 1726773036.49045: variable 'ansible_search_path' from source: unknown
  8771 1726773036.49077: calling self._execute()
  8771 1726773036.49132: variable 'ansible_host' from source: host vars for 'managed_node3'
  8771 1726773036.49143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8771 1726773036.49152: variable 'omit' from source: magic vars
  8771 1726773036.49228: variable 'omit' from source: magic vars
  8771 1726773036.49262: variable 'omit' from source: magic vars
  8771 1726773036.49508: variable '__kernel_settings_profile_src' from source: role '' all vars
  8771 1726773036.49518: variable '__kernel_settings_tuned_profile' from source: role '' all vars
  8771 1726773036.49579: variable '__kernel_settings_tuned_profile' from source: role '' all vars
  8771 1726773036.49601: variable '__kernel_settings_profile_filename' from source: role '' all vars
  8771 1726773036.49648: variable '__kernel_settings_profile_filename' from source: role '' all vars
  8771 1726773036.49701: variable '__kernel_settings_profile_dir' from source: role '' all vars
  8771 1726773036.49762: variable '__kernel_settings_profile_parent' from source: set_fact
  8771 1726773036.49773: variable '__kernel_settings_tuned_profile' from source: role '' all vars
  8771 1726773036.49800: variable 'omit' from source: magic vars
  8771 1726773036.49834: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8771 1726773036.49861: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8771 1726773036.49882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8771 1726773036.49897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8771 1726773036.49907: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8771 1726773036.49927: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8771 1726773036.49930: variable 'ansible_host' from source: host vars for 'managed_node3'
  8771 1726773036.49933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8771 1726773036.50004: Set connection var ansible_pipelining to False
  8771 1726773036.50012: Set connection var ansible_timeout to 10
  8771 1726773036.50015: Set connection var ansible_module_compression to ZIP_DEFLATED
  8771 1726773036.50019: Set connection var ansible_shell_executable to /bin/sh
  8771 1726773036.50021: Set connection var ansible_connection to ssh
  8771 1726773036.50025: Set connection var ansible_shell_type to sh
  8771 1726773036.50038: variable 'ansible_shell_executable' from source: unknown
  8771 1726773036.50040: variable 'ansible_connection' from source: unknown
  8771 1726773036.50042: variable 'ansible_module_compression' from source: unknown
  8771 1726773036.50044: variable 'ansible_shell_type' from source: unknown
  8771 1726773036.50045: variable 'ansible_shell_executable' from source: unknown
  8771 1726773036.50047: variable 'ansible_host' from source: host vars for 'managed_node3'
  8771 1726773036.50049: variable 'ansible_pipelining' from source: unknown
  8771 1726773036.50050: variable 'ansible_timeout' from source: unknown
  8771 1726773036.50052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8771 1726773036.50142: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  8771 1726773036.50151: variable 'omit' from source: magic vars
  8771 1726773036.50155: starting attempt loop
  8771 1726773036.50157: running the handler
  8771 1726773036.50164: _low_level_execute_command(): starting
  8771 1726773036.50173: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  8771 1726773036.52539: stdout chunk (state=2):
>>>/root
<<<
  8771 1726773036.52660: stderr chunk (state=3):
>>><<<
  8771 1726773036.52668: stdout chunk (state=3):
>>><<<
  8771 1726773036.52688: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  8771 1726773036.52702: _low_level_execute_command(): starting
  8771 1726773036.52708: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802 `" && echo ansible-tmp-1726773036.5269482-8771-232847385809802="` echo /root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802 `" ) && sleep 0'
  8771 1726773036.55226: stdout chunk (state=2):
>>>ansible-tmp-1726773036.5269482-8771-232847385809802=/root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802
<<<
  8771 1726773036.55358: stderr chunk (state=3):
>>><<<
  8771 1726773036.55365: stdout chunk (state=3):
>>><<<
  8771 1726773036.55384: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773036.5269482-8771-232847385809802=/root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802
, stderr=
  8771 1726773036.55402: evaluation_path:
	/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings
	/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks
  8771 1726773036.55421: search_path:
	/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2
	/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2
	/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2
	/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2
	/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2
	/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2
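The search_path list printed above shows the order in which candidate locations for kernel_settings.j2 are tried; the first existing file wins. A rough sketch of that "first found" lookup (paths copied from the log; this is not the template action plugin's actual code):

    # First-found lookup over the candidate template paths printed above.
    import os

    SEARCH_PATH = [
        "/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2",
        "/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2",
        "/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2",
        "/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2",
        "/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2",
        "/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2",
    ]

    def first_found(paths):
        for candidate in paths:
            if os.path.exists(candidate):
                return candidate
        return None

    if __name__ == "__main__":
        print(first_found(SEARCH_PATH))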
  8771 1726773036.55440: variable 'ansible_search_path' from source: unknown
  8771 1726773036.56166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
  8771 1726773036.57942: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
  8771 1726773036.58008: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
  8771 1726773036.58059: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
  8771 1726773036.58097: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
  8771 1726773036.58124: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
  8771 1726773036.58397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8771 1726773036.58424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8771 1726773036.58448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8771 1726773036.58480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8771 1726773036.58494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8771 1726773036.58815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8771 1726773036.58841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8771 1726773036.58865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8771 1726773036.58916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8771 1726773036.58931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8771 1726773036.59332: variable 'ansible_managed' from source: unknown
  8771 1726773036.59340: variable '__sections' from source: task vars
  8771 1726773036.59479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8771 1726773036.59506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8771 1726773036.59531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8771 1726773036.59571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8771 1726773036.59588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8771 1726773036.59687: variable 'kernel_settings_sysctl' from source: role '' defaults
  8771 1726773036.59694: variable '__kernel_settings_state_empty' from source: role '' all vars
  8771 1726773036.59701: variable '__kernel_settings_previous_replaced' from source: role '' all vars
  8771 1726773036.59737: variable '__sysctl_old' from source: task vars
  8771 1726773036.59801: variable '__sysctl_old' from source: task vars
  8771 1726773036.60072: variable 'kernel_settings_purge' from source: role '' defaults
  8771 1726773036.60079: variable 'kernel_settings_sysctl' from source: role '' defaults
  8771 1726773036.60084: variable '__kernel_settings_state_empty' from source: role '' all vars
  8771 1726773036.60091: variable '__kernel_settings_previous_replaced' from source: role '' all vars
  8771 1726773036.60096: variable '__kernel_settings_profile_contents' from source: set_fact
  8771 1726773036.60289: variable 'kernel_settings_sysfs' from source: role '' defaults
  8771 1726773036.60296: variable '__kernel_settings_state_empty' from source: role '' all vars
  8771 1726773036.60302: variable '__kernel_settings_previous_replaced' from source: role '' all vars
  8771 1726773036.60318: variable '__sysfs_old' from source: task vars
  8771 1726773036.60375: variable '__sysfs_old' from source: task vars
  8771 1726773036.60582: variable 'kernel_settings_purge' from source: role '' defaults
  8771 1726773036.60591: variable 'kernel_settings_sysfs' from source: role '' defaults
  8771 1726773036.60596: variable '__kernel_settings_state_empty' from source: role '' all vars
  8771 1726773036.60600: variable '__kernel_settings_previous_replaced' from source: role '' all vars
  8771 1726773036.60604: variable '__kernel_settings_profile_contents' from source: set_fact
  8771 1726773036.60621: variable 'kernel_settings_systemd_cpu_affinity' from source: role '' defaults
  8771 1726773036.60628: variable '__systemd_old' from source: task vars
  8771 1726773036.60679: variable '__systemd_old' from source: task vars
  8771 1726773036.60872: variable 'kernel_settings_purge' from source: role '' defaults
  8771 1726773036.60877: variable 'kernel_settings_systemd_cpu_affinity' from source: role '' defaults
  8771 1726773036.60880: variable '__kernel_settings_state_absent' from source: role '' all vars
  8771 1726773036.60884: variable '__kernel_settings_profile_contents' from source: set_fact
  8771 1726773036.60896: variable 'kernel_settings_transparent_hugepages' from source: role '' defaults
  8771 1726773036.60902: variable 'kernel_settings_transparent_hugepages_defrag' from source: role '' defaults
  8771 1726773036.60906: variable '__trans_huge_old' from source: task vars
  8771 1726773036.60953: variable '__trans_huge_old' from source: task vars
  8771 1726773036.61090: variable 'kernel_settings_purge' from source: role '' defaults
  8771 1726773036.61098: variable 'kernel_settings_transparent_hugepages' from source: role '' defaults
  8771 1726773036.61103: variable '__kernel_settings_state_absent' from source: role '' all vars
  8771 1726773036.61107: variable '__kernel_settings_profile_contents' from source: set_fact
  8771 1726773036.61117: variable '__trans_defrag_old' from source: task vars
  8771 1726773036.61178: variable '__trans_defrag_old' from source: task vars
  8771 1726773036.61824: variable 'kernel_settings_purge' from source: role '' defaults
  8771 1726773036.61832: variable 'kernel_settings_transparent_hugepages_defrag' from source: role '' defaults
  8771 1726773036.61837: variable '__kernel_settings_state_absent' from source: role '' all vars
  8771 1726773036.61842: variable '__kernel_settings_profile_contents' from source: set_fact
  8771 1726773036.61859: variable '__kernel_settings_state_absent' from source: role '' all vars
  8771 1726773036.61869: variable '__kernel_settings_state_absent' from source: role '' all vars
  8771 1726773036.61874: variable '__kernel_settings_state_absent' from source: role '' all vars
  8771 1726773036.62352: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  8771 1726773036.62398: variable 'ansible_module_compression' from source: unknown
  8771 1726773036.62436: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
  8771 1726773036.62457: variable 'ansible_facts' from source: unknown
  8771 1726773036.62523: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802/AnsiballZ_stat.py
  8771 1726773036.62615: Sending initial data
  8771 1726773036.62622: Sent initial data (151 bytes)
  8771 1726773036.65288: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpxb9vdrxn /root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802/AnsiballZ_stat.py
<<<
  8771 1726773036.66874: stderr chunk (state=3):
>>><<<
  8771 1726773036.66887: stdout chunk (state=3):
>>><<<
  8771 1726773036.66911: done transferring module to remote
  8771 1726773036.66925: _low_level_execute_command(): starting
  8771 1726773036.66931: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802/ /root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802/AnsiballZ_stat.py && sleep 0'
  8771 1726773036.70523: stderr chunk (state=2):
>>><<<
  8771 1726773036.70535: stdout chunk (state=2):
>>><<<
  8771 1726773036.70552: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8771 1726773036.70558: _low_level_execute_command(): starting
  8771 1726773036.70564: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802/AnsiballZ_stat.py && sleep 0'
  8771 1726773036.85825: stdout chunk (state=2):
>>>
{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}}
<<<
  8771 1726773036.86830: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8771 1726773036.86882: stderr chunk (state=3):
>>><<<
  8771 1726773036.86891: stdout chunk (state=3):
>>><<<
  8771 1726773036.86907: _low_level_execute_command() done: rc=0, stdout=
{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8771 1726773036.86930: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  8771 1726773036.87019: Sending initial data
  8771 1726773036.87026: Sent initial data (159 bytes)
  8771 1726773036.89658: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmp6tqmuhi2/kernel_settings.j2 /root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802/source
<<<
  8771 1726773036.90107: stderr chunk (state=3):
>>><<<
  8771 1726773036.90118: stdout chunk (state=3):
>>><<<
  8771 1726773036.90133: _low_level_execute_command(): starting
  8771 1726773036.90139: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802/ /root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802/source && sleep 0'
  8771 1726773036.92577: stderr chunk (state=2):
>>><<<
  8771 1726773036.92588: stdout chunk (state=2):
>>><<<
  8771 1726773036.92605: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8771 1726773036.92626: variable 'ansible_module_compression' from source: unknown
  8771 1726773036.92660: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED
  8771 1726773036.92683: variable 'ansible_facts' from source: unknown
  8771 1726773036.92742: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802/AnsiballZ_copy.py
  8771 1726773036.92859: Sending initial data
  8771 1726773036.92870: Sent initial data (151 bytes)
  8771 1726773036.95789: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpawhpk9yf /root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802/AnsiballZ_copy.py
<<<
  8771 1726773036.97378: stderr chunk (state=3):
>>><<<
  8771 1726773036.97392: stdout chunk (state=3):
>>><<<
  8771 1726773036.97416: done transferring module to remote
  8771 1726773036.97427: _low_level_execute_command(): starting
  8771 1726773036.97433: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802/ /root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802/AnsiballZ_copy.py && sleep 0'
  8771 1726773037.00124: stderr chunk (state=2):
>>><<<
  8771 1726773037.00136: stdout chunk (state=2):
>>><<<
  8771 1726773037.00154: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8771 1726773037.00159: _low_level_execute_command(): starting
  8771 1726773037.00168: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802/AnsiballZ_copy.py && sleep 0'
  8771 1726773037.16906: stdout chunk (state=2):
>>>
{"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802/source", "md5sum": "7d83891795eeb6debeff7e2812501630", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}}
<<<
  8771 1726773037.18105: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8771 1726773037.18154: stderr chunk (state=3):
>>><<<
  8771 1726773037.18161: stdout chunk (state=3):
>>><<<
  8771 1726773037.18177: _low_level_execute_command() done: rc=0, stdout=
{"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802/source", "md5sum": "7d83891795eeb6debeff7e2812501630", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8771 1726773037.18207: done with _execute_module (ansible.legacy.copy, {'src': '/root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802/source', 'dest': '/etc/tuned/kernel_settings/tuned.conf', 'mode': '0644', 'follow': False, '_original_basename': 'kernel_settings.j2', 'checksum': 'e44ba7fc7046252a1b6772f7347d0e7b9b48a069', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  8771 1726773037.18239: _low_level_execute_command(): starting
  8771 1726773037.18247: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802/ > /dev/null 2>&1 && sleep 0'
  8771 1726773037.20732: stderr chunk (state=2):
>>><<<
  8771 1726773037.20742: stdout chunk (state=2):
>>><<<
  8771 1726773037.20757: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8771 1726773037.20767: handler run complete
  8771 1726773037.20788: attempt loop complete, returning result
  8771 1726773037.20793: _execute() done
  8771 1726773037.20796: dumping result to json
  8771 1726773037.20802: done dumping result, returning
  8771 1726773037.20812: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings [0affffe7-6841-6cfb-81ae-000000000035]
  8771 1726773037.20819: sending task result for task 0affffe7-6841-6cfb-81ae-000000000035
  8771 1726773037.20862: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000035
  8771 1726773037.20865: WORKER PROCESS EXITING
changed: [managed_node3] => {
    "changed": true,
    "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069",
    "dest": "/etc/tuned/kernel_settings/tuned.conf",
    "gid": 0,
    "group": "root",
    "md5sum": "7d83891795eeb6debeff7e2812501630",
    "mode": "0644",
    "owner": "root",
    "secontext": "system_u:object_r:tuned_etc_t:s0",
    "size": 86,
    "src": "/root/.ansible/tmp/ansible-tmp-1726773036.5269482-8771-232847385809802/source",
    "state": "file",
    "uid": 0
}
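The "Apply kernel settings" task rendered kernel_settings.j2 locally, stat'ed the destination (which reported "exists": false), and then copied the rendered file (86 bytes, SHA-1 e44ba7fc...). A minimal sketch of that stat-then-copy decision, under the assumption that the action copies only when the destination is missing or its checksum differs; this is illustrative, not the template action plugin's actual code, and the rendered content below is a placeholder, not the real template output:

    # Sketch of the "copy only if missing or different" decision seen in the log.
    import hashlib
    import os

    def sha1_bytes(data: bytes) -> str:
        return hashlib.sha1(data).hexdigest()

    def needs_copy(rendered: bytes, dest: str) -> bool:
        if not os.path.exists(dest):          # matches "stat": {"exists": false} above
            return True
        with open(dest, "rb") as f:
            return sha1_bytes(f.read()) != sha1_bytes(rendered)

    if __name__ == "__main__":
        rendered = b"# placeholder for the rendered tuned.conf content\n"
        print(needs_copy(rendered, "/etc/tuned/kernel_settings/tuned.conf"))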
  8303 1726773037.21064: no more pending results, returning what we have
  8303 1726773037.21071: results queue empty
  8303 1726773037.21071: checking for any_errors_fatal
  8303 1726773037.21077: done checking for any_errors_fatal
  8303 1726773037.21077: checking for max_fail_percentage
  8303 1726773037.21079: done checking for max_fail_percentage
  8303 1726773037.21079: checking to see if all hosts have failed and the running result is not ok
  8303 1726773037.21080: done checking to see if all hosts have failed
  8303 1726773037.21080: getting the remaining hosts for this loop
  8303 1726773037.21081: done getting the remaining hosts for this loop
  8303 1726773037.21086: getting the next task for host managed_node3
  8303 1726773037.21091: done getting next task for host managed_node3
  8303 1726773037.21093:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes
  8303 1726773037.21096:  ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773037.21104: getting variables
  8303 1726773037.21105: in VariableManager get_vars()
  8303 1726773037.21132: Calling all_inventory to load vars for managed_node3
  8303 1726773037.21134: Calling groups_inventory to load vars for managed_node3
  8303 1726773037.21135: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773037.21143: Calling all_plugins_play to load vars for managed_node3
  8303 1726773037.21145: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773037.21146: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773037.21183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773037.21215: done with get_vars()
  8303 1726773037.21221: done getting variables
  8303 1726773037.21260: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149
Thursday 19 September 2024  15:10:37 -0400 (0:00:00.726)       0:00:13.791 **** 
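This task runs only because the registered profile/mode results reported a change (the conditional is evaluated a few lines below), and it ultimately dispatches to the systemd module with name=tuned, state=restarted, enabled=true. The sketch below is illustrative only: roughly the manual equivalent of that module call via systemctl (requires root), with none of the real module's state handling.

    # Rough manual equivalent of: systemd module with name=tuned,
    # state=restarted, enabled=true (illustrative sketch only).
    import subprocess

    def restart_and_enable(unit: str = "tuned") -> None:
        subprocess.run(["systemctl", "enable", unit], check=True)
        subprocess.run(["systemctl", "restart", unit], check=True)
        state = subprocess.run(
            ["systemctl", "is-active", unit],
            capture_output=True, text=True, check=False,
        ).stdout.strip()
        print(f"{unit}: {state}")

    if __name__ == "__main__":
        restart_and_enable()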
  8303 1726773037.21283: entering _queue_task() for managed_node3/service
  8303 1726773037.21453: worker is 1 (out of 1 available)
  8303 1726773037.21470: exiting _queue_task() for managed_node3/service
  8303 1726773037.21482: done queuing things up, now waiting for results queue to drain
  8303 1726773037.21483: waiting for pending results...
  8817 1726773037.21592: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes
  8817 1726773037.21692: in run() - task 0affffe7-6841-6cfb-81ae-000000000036
  8817 1726773037.21707: variable 'ansible_search_path' from source: unknown
  8817 1726773037.21711: variable 'ansible_search_path' from source: unknown
  8817 1726773037.21747: variable '__kernel_settings_services' from source: include_vars
  8817 1726773037.21973: variable '__kernel_settings_services' from source: include_vars
  8817 1726773037.22032: variable 'omit' from source: magic vars
  8817 1726773037.22108: variable 'ansible_host' from source: host vars for 'managed_node3'
  8817 1726773037.22119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8817 1726773037.22128: variable 'omit' from source: magic vars
  8817 1726773037.22455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
  8817 1726773037.22624: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
  8817 1726773037.22656: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
  8817 1726773037.22683: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
  8817 1726773037.22713: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
  8817 1726773037.22780: variable '__kernel_settings_register_profile' from source: set_fact
  8817 1726773037.22796: variable '__kernel_settings_register_mode' from source: set_fact
  8817 1726773037.22813: Evaluated conditional (__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed): True
  8817 1726773037.22820: variable 'omit' from source: magic vars
  8817 1726773037.22847: variable 'omit' from source: magic vars
  8817 1726773037.22876: variable 'item' from source: unknown
  8817 1726773037.22927: variable 'item' from source: unknown
  8817 1726773037.22944: variable 'omit' from source: magic vars
  8817 1726773037.22968: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8817 1726773037.22991: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8817 1726773037.23007: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8817 1726773037.23022: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8817 1726773037.23032: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8817 1726773037.23054: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8817 1726773037.23059: variable 'ansible_host' from source: host vars for 'managed_node3'
  8817 1726773037.23063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8817 1726773037.23129: Set connection var ansible_pipelining to False
  8817 1726773037.23139: Set connection var ansible_timeout to 10
  8817 1726773037.23146: Set connection var ansible_module_compression to ZIP_DEFLATED
  8817 1726773037.23152: Set connection var ansible_shell_executable to /bin/sh
  8817 1726773037.23155: Set connection var ansible_connection to ssh
  8817 1726773037.23161: Set connection var ansible_shell_type to sh
  8817 1726773037.23176: variable 'ansible_shell_executable' from source: unknown
  8817 1726773037.23179: variable 'ansible_connection' from source: unknown
  8817 1726773037.23183: variable 'ansible_module_compression' from source: unknown
  8817 1726773037.23188: variable 'ansible_shell_type' from source: unknown
  8817 1726773037.23191: variable 'ansible_shell_executable' from source: unknown
  8817 1726773037.23195: variable 'ansible_host' from source: host vars for 'managed_node3'
  8817 1726773037.23199: variable 'ansible_pipelining' from source: unknown
  8817 1726773037.23202: variable 'ansible_timeout' from source: unknown
  8817 1726773037.23206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8817 1726773037.23272: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  8817 1726773037.23282: variable 'omit' from source: magic vars
  8817 1726773037.23291: starting attempt loop
  8817 1726773037.23294: running the handler
  8817 1726773037.23348: variable 'ansible_facts' from source: unknown
  8817 1726773037.23375: _low_level_execute_command(): starting
  8817 1726773037.23381: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  8817 1726773037.25747: stdout chunk (state=2):
>>>/root
<<<
  8817 1726773037.25868: stderr chunk (state=3):
>>><<<
  8817 1726773037.25875: stdout chunk (state=3):
>>><<<
  8817 1726773037.25895: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  8817 1726773037.25908: _low_level_execute_command(): starting
  8817 1726773037.25914: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773037.2590277-8817-9842850860316 `" && echo ansible-tmp-1726773037.2590277-8817-9842850860316="` echo /root/.ansible/tmp/ansible-tmp-1726773037.2590277-8817-9842850860316 `" ) && sleep 0'
  8817 1726773037.28451: stdout chunk (state=2):
>>>ansible-tmp-1726773037.2590277-8817-9842850860316=/root/.ansible/tmp/ansible-tmp-1726773037.2590277-8817-9842850860316
<<<
  8817 1726773037.28589: stderr chunk (state=3):
>>><<<
  8817 1726773037.28597: stdout chunk (state=3):
>>><<<
  8817 1726773037.28613: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773037.2590277-8817-9842850860316=/root/.ansible/tmp/ansible-tmp-1726773037.2590277-8817-9842850860316
, stderr=
  8817 1726773037.28639: variable 'ansible_module_compression' from source: unknown
  8817 1726773037.28676: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
  8817 1726773037.28725: variable 'ansible_facts' from source: unknown
  8817 1726773037.28879: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773037.2590277-8817-9842850860316/AnsiballZ_setup.py
  8817 1726773037.28991: Sending initial data
  8817 1726773037.28998: Sent initial data (150 bytes)
  8817 1726773037.31758: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpmaifl2xa /root/.ansible/tmp/ansible-tmp-1726773037.2590277-8817-9842850860316/AnsiballZ_setup.py
<<<
  8817 1726773037.36024: stderr chunk (state=3):
>>><<<
  8817 1726773037.36035: stdout chunk (state=3):
>>><<<
  8817 1726773037.36061: done transferring module to remote
  8817 1726773037.36074: _low_level_execute_command(): starting
  8817 1726773037.36080: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773037.2590277-8817-9842850860316/ /root/.ansible/tmp/ansible-tmp-1726773037.2590277-8817-9842850860316/AnsiballZ_setup.py && sleep 0'
  8817 1726773037.39264: stderr chunk (state=2):
>>><<<
  8817 1726773037.39278: stdout chunk (state=2):
>>><<<
  8817 1726773037.39298: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8817 1726773037.39304: _low_level_execute_command(): starting
  8817 1726773037.39311: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773037.2590277-8817-9842850860316/AnsiballZ_setup.py && sleep 0'
  8817 1726773037.69593: stdout chunk (state=2):
>>>
{"ansible_facts": {"ansible_service_mgr": "systemd"}, "invocation": {"module_args": {"gather_subset": ["!all"], "filter": ["ansible_service_mgr"], "gather_timeout": 10, "fact_path": "/etc/ansible/facts.d"}}}
<<<
  8817 1726773037.70992: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8817 1726773037.71042: stderr chunk (state=3):
>>><<<
  8817 1726773037.71051: stdout chunk (state=3):
>>><<<
  8817 1726773037.71066: _low_level_execute_command() done: rc=0, stdout=
{"ansible_facts": {"ansible_service_mgr": "systemd"}, "invocation": {"module_args": {"gather_subset": ["!all"], "filter": ["ansible_service_mgr"], "gather_timeout": 10, "fact_path": "/etc/ansible/facts.d"}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8817 1726773037.71091: done with _execute_module (ansible.legacy.setup, {'gather_subset': '!all', 'filter': 'ansible_service_mgr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773037.2590277-8817-9842850860316/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  8817 1726773037.71107: Facts {'ansible_facts': {'ansible_service_mgr': 'systemd'}, 'invocation': {'module_args': {'gather_subset': ['!all'], 'filter': ['ansible_service_mgr'], 'gather_timeout': 10, 'fact_path': '/etc/ansible/facts.d'}}, '_ansible_parsed': True}
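Before calling the systemd module, the service action gathers just the ansible_service_mgr fact with a filtered setup call, as shown above. A hedged sketch of reproducing that minimal fact gather from a control node with ansible installed (wrapped in subprocess only to keep these examples in one language; "localhost" is a stand-in target, not the managed node from this log):

    # Reproduce the minimal fact gather seen above: setup with
    # gather_subset=!all and filter=ansible_service_mgr.
    import subprocess

    subprocess.run(
        [
            "ansible", "localhost", "-m", "setup",
            "-a", "gather_subset=!all filter=ansible_service_mgr",
        ],
        check=True,
    )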
  8817 1726773037.71159: variable 'ansible_module_compression' from source: unknown
  8817 1726773037.71193: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED
  8817 1726773037.71238: variable 'ansible_facts' from source: unknown
  8817 1726773037.71394: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773037.2590277-8817-9842850860316/AnsiballZ_systemd.py
  8817 1726773037.71500: Sending initial data
  8817 1726773037.71508: Sent initial data (152 bytes)
  8817 1726773037.74208: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpkj20cmnr /root/.ansible/tmp/ansible-tmp-1726773037.2590277-8817-9842850860316/AnsiballZ_systemd.py
<<<
  8817 1726773037.76694: stderr chunk (state=3):
>>><<<
  8817 1726773037.76708: stdout chunk (state=3):
>>><<<
  8817 1726773037.76734: done transferring module to remote
  8817 1726773037.76744: _low_level_execute_command(): starting
  8817 1726773037.76750: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773037.2590277-8817-9842850860316/ /root/.ansible/tmp/ansible-tmp-1726773037.2590277-8817-9842850860316/AnsiballZ_systemd.py && sleep 0'
  8817 1726773037.79239: stderr chunk (state=2):
>>><<<
  8817 1726773037.79249: stdout chunk (state=2):
>>><<<
  8817 1726773037.79264: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8817 1726773037.79269: _low_level_execute_command(): starting
  8817 1726773037.79275: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773037.2590277-8817-9842850860316/AnsiballZ_systemd.py && sleep 0'
  8817 1726773038.33594: stdout chunk (state=2):
>>>
{"name": "tuned", "changed": true, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:03:38 EDT", "WatchdogTimestampMonotonic": "33506369", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "664", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:03:37 EDT", "ExecMainStartTimestampMonotonic": "32243396", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "664", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:03:37 EDT] ; stop_time=[n/a] ; pid=664 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18620416", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", 
"TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:03:38 EDT", "StateChangeTimestampMonotonic": "33506375", "InactiveExitTimestamp": "Thu 2024-09-19 15:03:37 EDT", "InactiveExitTimestampMonotonic": "32243440", "ActiveEnterTimestamp": "Thu 2024-09-19 15:03:38 EDT", "ActiveEnterTimestampMonotonic": "33506375", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:03:37 EDT", "ConditionTimestampMonotonic": "32242529", "AssertTimestamp": "Thu 2024-09-19 15:03:37 EDT", "AssertTimestampMonotonic": "32242534", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "7582752b17874324b2c9dc01ae0a603c", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "restarted", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}}
<<<
  8817 1726773038.35980: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8817 1726773038.35997: stdout chunk (state=3):
>>><<<
  8817 1726773038.36010: stderr chunk (state=3):
>>><<<
  8817 1726773038.36030: _low_level_execute_command() done: rc=0, stdout=
{"name": "tuned", "changed": true, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:03:38 EDT", "WatchdogTimestampMonotonic": "33506369", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "664", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:03:37 EDT", "ExecMainStartTimestampMonotonic": "32243396", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "664", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:03:37 EDT] ; stop_time=[n/a] ; pid=664 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18620416", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", 
"TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:03:38 EDT", "StateChangeTimestampMonotonic": "33506375", "InactiveExitTimestamp": "Thu 2024-09-19 15:03:37 EDT", "InactiveExitTimestampMonotonic": "32243440", "ActiveEnterTimestamp": "Thu 2024-09-19 15:03:38 EDT", "ActiveEnterTimestampMonotonic": "33506375", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:03:37 EDT", "ConditionTimestampMonotonic": "32242529", "AssertTimestamp": "Thu 2024-09-19 15:03:37 EDT", "AssertTimestampMonotonic": "32242534", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "7582752b17874324b2c9dc01ae0a603c", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "restarted", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8817 1726773038.36213: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'restarted', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773037.2590277-8817-9842850860316/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  8817 1726773038.36238: _low_level_execute_command(): starting
  8817 1726773038.36244: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773037.2590277-8817-9842850860316/ > /dev/null 2>&1 && sleep 0'
  8817 1726773038.41489: stderr chunk (state=2):
>>><<<
  8817 1726773038.41502: stdout chunk (state=2):
>>><<<
  8817 1726773038.41522: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8817 1726773038.41530: handler run complete
  8817 1726773038.41581: attempt loop complete, returning result
  8817 1726773038.41604: variable 'item' from source: unknown
  8817 1726773038.41681: variable 'item' from source: unknown
changed: [managed_node3] => (item=tuned) => {
    "ansible_loop_var": "item",
    "changed": true,
    "enabled": true,
    "item": "tuned",
    "name": "tuned",
    "state": "started",
    "status": {
        "ActiveEnterTimestamp": "Thu 2024-09-19 15:03:38 EDT",
        "ActiveEnterTimestampMonotonic": "33506375",
        "ActiveExitTimestampMonotonic": "0",
        "ActiveState": "active",
        "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target",
        "AllowIsolate": "no",
        "AllowedCPUs": "",
        "AllowedMemoryNodes": "",
        "AmbientCapabilities": "",
        "AssertResult": "yes",
        "AssertTimestamp": "Thu 2024-09-19 15:03:37 EDT",
        "AssertTimestampMonotonic": "32242534",
        "Before": "shutdown.target multi-user.target",
        "BlockIOAccounting": "no",
        "BlockIOWeight": "[not set]",
        "BusName": "com.redhat.tuned",
        "CPUAccounting": "no",
        "CPUAffinity": "",
        "CPUAffinityFromNUMA": "no",
        "CPUQuotaPerSecUSec": "infinity",
        "CPUQuotaPeriodUSec": "infinity",
        "CPUSchedulingPolicy": "0",
        "CPUSchedulingPriority": "0",
        "CPUSchedulingResetOnFork": "no",
        "CPUShares": "[not set]",
        "CPUUsageNSec": "[not set]",
        "CPUWeight": "[not set]",
        "CacheDirectoryMode": "0755",
        "CanFreeze": "yes",
        "CanIsolate": "no",
        "CanReload": "no",
        "CanStart": "yes",
        "CanStop": "yes",
        "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf",
        "CollectMode": "inactive",
        "ConditionResult": "yes",
        "ConditionTimestamp": "Thu 2024-09-19 15:03:37 EDT",
        "ConditionTimestampMonotonic": "32242529",
        "ConfigurationDirectoryMode": "0755",
        "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target",
        "ControlGroup": "/system.slice/tuned.service",
        "ControlPID": "0",
        "DefaultDependencies": "yes",
        "DefaultMemoryLow": "0",
        "DefaultMemoryMin": "0",
        "Delegate": "no",
        "Description": "Dynamic System Tuning Daemon",
        "DevicePolicy": "auto",
        "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)",
        "DynamicUser": "no",
        "EffectiveCPUs": "",
        "EffectiveMemoryNodes": "",
        "ExecMainCode": "0",
        "ExecMainExitTimestampMonotonic": "0",
        "ExecMainPID": "664",
        "ExecMainStartTimestamp": "Thu 2024-09-19 15:03:37 EDT",
        "ExecMainStartTimestampMonotonic": "32243396",
        "ExecMainStatus": "0",
        "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:03:37 EDT] ; stop_time=[n/a] ; pid=664 ; code=(null) ; status=0/0 }",
        "FailureAction": "none",
        "FileDescriptorStoreMax": "0",
        "FragmentPath": "/usr/lib/systemd/system/tuned.service",
        "FreezerState": "running",
        "GID": "[not set]",
        "GuessMainPID": "yes",
        "IOAccounting": "no",
        "IOSchedulingClass": "0",
        "IOSchedulingPriority": "0",
        "IOWeight": "[not set]",
        "IPAccounting": "no",
        "IPEgressBytes": "18446744073709551615",
        "IPEgressPackets": "18446744073709551615",
        "IPIngressBytes": "18446744073709551615",
        "IPIngressPackets": "18446744073709551615",
        "Id": "tuned.service",
        "IgnoreOnIsolate": "no",
        "IgnoreSIGPIPE": "yes",
        "InactiveEnterTimestampMonotonic": "0",
        "InactiveExitTimestamp": "Thu 2024-09-19 15:03:37 EDT",
        "InactiveExitTimestampMonotonic": "32243440",
        "InvocationID": "7582752b17874324b2c9dc01ae0a603c",
        "JobRunningTimeoutUSec": "infinity",
        "JobTimeoutAction": "none",
        "JobTimeoutUSec": "infinity",
        "KeyringMode": "private",
        "KillMode": "control-group",
        "KillSignal": "15",
        "LimitAS": "infinity",
        "LimitASSoft": "infinity",
        "LimitCORE": "infinity",
        "LimitCORESoft": "0",
        "LimitCPU": "infinity",
        "LimitCPUSoft": "infinity",
        "LimitDATA": "infinity",
        "LimitDATASoft": "infinity",
        "LimitFSIZE": "infinity",
        "LimitFSIZESoft": "infinity",
        "LimitLOCKS": "infinity",
        "LimitLOCKSSoft": "infinity",
        "LimitMEMLOCK": "65536",
        "LimitMEMLOCKSoft": "65536",
        "LimitMSGQUEUE": "819200",
        "LimitMSGQUEUESoft": "819200",
        "LimitNICE": "0",
        "LimitNICESoft": "0",
        "LimitNOFILE": "262144",
        "LimitNOFILESoft": "1024",
        "LimitNPROC": "14003",
        "LimitNPROCSoft": "14003",
        "LimitRSS": "infinity",
        "LimitRSSSoft": "infinity",
        "LimitRTPRIO": "0",
        "LimitRTPRIOSoft": "0",
        "LimitRTTIME": "infinity",
        "LimitRTTIMESoft": "infinity",
        "LimitSIGPENDING": "14003",
        "LimitSIGPENDINGSoft": "14003",
        "LimitSTACK": "infinity",
        "LimitSTACKSoft": "8388608",
        "LoadState": "loaded",
        "LockPersonality": "no",
        "LogLevelMax": "-1",
        "LogRateLimitBurst": "0",
        "LogRateLimitIntervalUSec": "0",
        "LogsDirectoryMode": "0755",
        "MainPID": "664",
        "MemoryAccounting": "yes",
        "MemoryCurrent": "18620416",
        "MemoryDenyWriteExecute": "no",
        "MemoryHigh": "infinity",
        "MemoryLimit": "infinity",
        "MemoryLow": "0",
        "MemoryMax": "infinity",
        "MemoryMin": "0",
        "MemorySwapMax": "infinity",
        "MountAPIVFS": "no",
        "MountFlags": "",
        "NFileDescriptorStore": "0",
        "NRestarts": "0",
        "NUMAMask": "",
        "NUMAPolicy": "n/a",
        "Names": "tuned.service",
        "NeedDaemonReload": "no",
        "Nice": "0",
        "NoNewPrivileges": "no",
        "NonBlocking": "no",
        "NotifyAccess": "none",
        "OOMScoreAdjust": "0",
        "OnFailureJobMode": "replace",
        "PIDFile": "/run/tuned/tuned.pid",
        "PermissionsStartOnly": "no",
        "Perpetual": "no",
        "PrivateDevices": "no",
        "PrivateMounts": "no",
        "PrivateNetwork": "no",
        "PrivateTmp": "no",
        "PrivateUsers": "no",
        "ProtectControlGroups": "no",
        "ProtectHome": "no",
        "ProtectKernelModules": "no",
        "ProtectKernelTunables": "no",
        "ProtectSystem": "no",
        "RefuseManualStart": "no",
        "RefuseManualStop": "no",
        "RemainAfterExit": "no",
        "RemoveIPC": "no",
        "Requires": "system.slice sysinit.target dbus.service dbus.socket",
        "Restart": "no",
        "RestartUSec": "100ms",
        "RestrictNamespaces": "no",
        "RestrictRealtime": "no",
        "RestrictSUIDSGID": "no",
        "Result": "success",
        "RootDirectoryStartOnly": "no",
        "RuntimeDirectoryMode": "0755",
        "RuntimeDirectoryPreserve": "no",
        "RuntimeMaxUSec": "infinity",
        "SameProcessGroup": "no",
        "SecureBits": "0",
        "SendSIGHUP": "no",
        "SendSIGKILL": "yes",
        "Slice": "system.slice",
        "StandardError": "inherit",
        "StandardInput": "null",
        "StandardInputData": "",
        "StandardOutput": "journal",
        "StartLimitAction": "none",
        "StartLimitBurst": "5",
        "StartLimitIntervalUSec": "10s",
        "StartupBlockIOWeight": "[not set]",
        "StartupCPUShares": "[not set]",
        "StartupCPUWeight": "[not set]",
        "StartupIOWeight": "[not set]",
        "StateChangeTimestamp": "Thu 2024-09-19 15:03:38 EDT",
        "StateChangeTimestampMonotonic": "33506375",
        "StateDirectoryMode": "0755",
        "StatusErrno": "0",
        "StopWhenUnneeded": "no",
        "SubState": "running",
        "SuccessAction": "none",
        "SyslogFacility": "3",
        "SyslogLevel": "6",
        "SyslogLevelPrefix": "yes",
        "SyslogPriority": "30",
        "SystemCallErrorNumber": "0",
        "TTYReset": "no",
        "TTYVHangup": "no",
        "TTYVTDisallocate": "no",
        "TasksAccounting": "yes",
        "TasksCurrent": "4",
        "TasksMax": "22405",
        "TimeoutStartUSec": "1min 30s",
        "TimeoutStopUSec": "1min 30s",
        "TimerSlackNSec": "50000",
        "Transient": "no",
        "Type": "dbus",
        "UID": "[not set]",
        "UMask": "0022",
        "UnitFilePreset": "enabled",
        "UnitFileState": "enabled",
        "UtmpMode": "init",
        "WantedBy": "multi-user.target",
        "WatchdogTimestamp": "Thu 2024-09-19 15:03:38 EDT",
        "WatchdogTimestampMonotonic": "33506369",
        "WatchdogUSec": "0"
    }
}
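The block above records the "Restart tuned to apply active profile, mode changes" task: the systemd module is called once per service in a loop (ansible_loop_var: item, here item=tuned) with name=tuned, state=restarted and enabled=true, exactly as shown in the _execute_module arguments earlier in this chunk. A minimal standalone sketch of an equivalent task, reconstructed from those logged arguments rather than taken from the role's source; the __kernel_settings_services list variable is a placeholder:

- name: Restart tuned to apply active profile, mode changes
  ansible.builtin.systemd:
    name: "{{ item }}"
    state: restarted
    enabled: true
  loop: "{{ __kernel_settings_services | default(['tuned']) }}"  # placeholder variable name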
  8817 1726773038.41821: dumping result to json
  8817 1726773038.41840: done dumping result, returning
  8817 1726773038.41850: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes [0affffe7-6841-6cfb-81ae-000000000036]
  8817 1726773038.41857: sending task result for task 0affffe7-6841-6cfb-81ae-000000000036
  8817 1726773038.41965: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000036
  8817 1726773038.41970: WORKER PROCESS EXITING
  8303 1726773038.42886: no more pending results, returning what we have
  8303 1726773038.42889: results queue empty
  8303 1726773038.42889: checking for any_errors_fatal
  8303 1726773038.42897: done checking for any_errors_fatal
  8303 1726773038.42897: checking for max_fail_percentage
  8303 1726773038.42899: done checking for max_fail_percentage
  8303 1726773038.42899: checking to see if all hosts have failed and the running result is not ok
  8303 1726773038.42900: done checking to see if all hosts have failed
  8303 1726773038.42900: getting the remaining hosts for this loop
  8303 1726773038.42901: done getting the remaining hosts for this loop
  8303 1726773038.42904: getting the next task for host managed_node3
  8303 1726773038.42909: done getting next task for host managed_node3
  8303 1726773038.42912:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings
  8303 1726773038.42914:  ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773038.42923: getting variables
  8303 1726773038.42925: in VariableManager get_vars()
  8303 1726773038.42950: Calling all_inventory to load vars for managed_node3
  8303 1726773038.42952: Calling groups_inventory to load vars for managed_node3
  8303 1726773038.42955: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773038.42963: Calling all_plugins_play to load vars for managed_node3
  8303 1726773038.42966: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773038.42969: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773038.43019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773038.43058: done with get_vars()
  8303 1726773038.43065: done getting variables
  8303 1726773038.43155: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Tuned apply settings] ********
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157
Thursday 19 September 2024  15:10:38 -0400 (0:00:01.218)       0:00:15.010 **** 
  8303 1726773038.43183: entering _queue_task() for managed_node3/command
  8303 1726773038.43186: Creating lock for command
  8303 1726773038.43391: worker is 1 (out of 1 available)
  8303 1726773038.43405: exiting _queue_task() for managed_node3/command
  8303 1726773038.43417: done queuing things up, now waiting for results queue to drain
  8303 1726773038.43418: waiting for pending results...
  8889 1726773038.43895: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings
  8889 1726773038.44018: in run() - task 0affffe7-6841-6cfb-81ae-000000000037
  8889 1726773038.44036: variable 'ansible_search_path' from source: unknown
  8889 1726773038.44041: variable 'ansible_search_path' from source: unknown
  8889 1726773038.44078: calling self._execute()
  8889 1726773038.44144: variable 'ansible_host' from source: host vars for 'managed_node3'
  8889 1726773038.44153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8889 1726773038.44164: variable 'omit' from source: magic vars
  8889 1726773038.44597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
  8889 1726773038.44952: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
  8889 1726773038.44995: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
  8889 1726773038.45027: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
  8889 1726773038.45059: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
  8889 1726773038.45177: variable '__kernel_settings_register_profile' from source: set_fact
  8889 1726773038.45202: Evaluated conditional (not __kernel_settings_register_profile is changed): False
  8889 1726773038.45212: when evaluation is False, skipping this task
  8889 1726773038.45220: _execute() done
  8889 1726773038.45224: dumping result to json
  8889 1726773038.45230: done dumping result, returning
  8889 1726773038.45234: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings [0affffe7-6841-6cfb-81ae-000000000037]
  8889 1726773038.45239: sending task result for task 0affffe7-6841-6cfb-81ae-000000000037
  8889 1726773038.45258: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000037
  8889 1726773038.45260: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not __kernel_settings_register_profile is changed",
    "skip_reason": "Conditional result was False"
}
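The skip above comes from the task's when: gate, not __kernel_settings_register_profile is changed. In this run the profile registration did change, so the condition is False and the explicit apply step is skipped (the service restart in the previous task already picks the profile up). A minimal sketch of that gating pattern; the command and profile name are placeholders, since the role's real arguments are not visible in this log:

- name: Tuned apply settings
  ansible.builtin.command: tuned-adm profile kernel_settings  # placeholder profile name
  when: not __kernel_settings_register_profile is changed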
  8303 1726773038.45561: no more pending results, returning what we have
  8303 1726773038.45564: results queue empty
  8303 1726773038.45565: checking for any_errors_fatal
  8303 1726773038.45580: done checking for any_errors_fatal
  8303 1726773038.45581: checking for max_fail_percentage
  8303 1726773038.45582: done checking for max_fail_percentage
  8303 1726773038.45582: checking to see if all hosts have failed and the running result is not ok
  8303 1726773038.45583: done checking to see if all hosts have failed
  8303 1726773038.45584: getting the remaining hosts for this loop
  8303 1726773038.45586: done getting the remaining hosts for this loop
  8303 1726773038.45589: getting the next task for host managed_node3
  8303 1726773038.45595: done getting next task for host managed_node3
  8303 1726773038.45598:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings
  8303 1726773038.45601:  ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773038.45613: getting variables
  8303 1726773038.45614: in VariableManager get_vars()
  8303 1726773038.45644: Calling all_inventory to load vars for managed_node3
  8303 1726773038.45647: Calling groups_inventory to load vars for managed_node3
  8303 1726773038.45650: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773038.45658: Calling all_plugins_play to load vars for managed_node3
  8303 1726773038.45660: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773038.45661: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773038.45703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773038.45732: done with get_vars()
  8303 1726773038.45738: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Verify settings] *************
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:166
Thursday 19 September 2024  15:10:38 -0400 (0:00:00.026)       0:00:15.036 **** 
  8303 1726773038.45807: entering _queue_task() for managed_node3/include_tasks
  8303 1726773038.45977: worker is 1 (out of 1 available)
  8303 1726773038.45994: exiting _queue_task() for managed_node3/include_tasks
  8303 1726773038.46006: done queuing things up, now waiting for results queue to drain
  8303 1726773038.46007: waiting for pending results...
  8893 1726773038.46211: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Verify settings
  8893 1726773038.46336: in run() - task 0affffe7-6841-6cfb-81ae-000000000038
  8893 1726773038.46355: variable 'ansible_search_path' from source: unknown
  8893 1726773038.46360: variable 'ansible_search_path' from source: unknown
  8893 1726773038.46394: calling self._execute()
  8893 1726773038.46461: variable 'ansible_host' from source: host vars for 'managed_node3'
  8893 1726773038.46469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8893 1726773038.46475: variable 'omit' from source: magic vars
  8893 1726773038.46820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
  8893 1726773038.46996: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
  8893 1726773038.47030: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
  8893 1726773038.47058: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
  8893 1726773038.47089: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
  8893 1726773038.47189: variable '__kernel_settings_register_apply' from source: set_fact
  8893 1726773038.47215: Evaluated conditional (__kernel_settings_register_apply is changed): True
  8893 1726773038.47224: _execute() done
  8893 1726773038.47228: dumping result to json
  8893 1726773038.47231: done dumping result, returning
  8893 1726773038.47234: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Verify settings [0affffe7-6841-6cfb-81ae-000000000038]
  8893 1726773038.47238: sending task result for task 0affffe7-6841-6cfb-81ae-000000000038
  8893 1726773038.47257: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000038
  8893 1726773038.47258: WORKER PROCESS EXITING
  8303 1726773038.47594: no more pending results, returning what we have
  8303 1726773038.47598: in VariableManager get_vars()
  8303 1726773038.47628: Calling all_inventory to load vars for managed_node3
  8303 1726773038.47632: Calling groups_inventory to load vars for managed_node3
  8303 1726773038.47634: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773038.47643: Calling all_plugins_play to load vars for managed_node3
  8303 1726773038.47645: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773038.47648: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773038.47699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773038.47737: done with get_vars()
  8303 1726773038.47745: variable 'ansible_search_path' from source: unknown
  8303 1726773038.47747: variable 'ansible_search_path' from source: unknown
  8303 1726773038.47784: we have included files to process
  8303 1726773038.47787: generating all_blocks data
  8303 1726773038.47791: done generating all_blocks data
  8303 1726773038.47795: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml
  8303 1726773038.47796: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml
  8303 1726773038.47799: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml
included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml for managed_node3
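The include above is gated on __kernel_settings_register_apply is changed, which evaluated True a few lines earlier, so the verification tasks only run when something was actually applied. An illustrative reconstruction using only the details visible in this log:

- name: Verify settings
  ansible.builtin.include_tasks: verify_settings.yml
  when: __kernel_settings_register_apply is changed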
  8303 1726773038.48281: done processing included file
  8303 1726773038.48283: iterating over new_blocks loaded from include file
  8303 1726773038.48283: in VariableManager get_vars()
  8303 1726773038.48302: done with get_vars()
  8303 1726773038.48303: filtering new block on tags
  8303 1726773038.48329: done filtering new block on tags
  8303 1726773038.48331: done iterating over new_blocks loaded from include file
  8303 1726773038.48332: extending task lists for all hosts with included blocks
  8303 1726773038.48879: done extending task lists
  8303 1726773038.48880: done processing included files
  8303 1726773038.48880: results queue empty
  8303 1726773038.48881: checking for any_errors_fatal
  8303 1726773038.48884: done checking for any_errors_fatal
  8303 1726773038.48884: checking for max_fail_percentage
  8303 1726773038.48887: done checking for max_fail_percentage
  8303 1726773038.48887: checking to see if all hosts have failed and the running result is not ok
  8303 1726773038.48888: done checking to see if all hosts have failed
  8303 1726773038.48888: getting the remaining hosts for this loop
  8303 1726773038.48889: done getting the remaining hosts for this loop
  8303 1726773038.48891: getting the next task for host managed_node3
  8303 1726773038.48895: done getting next task for host managed_node3
  8303 1726773038.48898:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly
  8303 1726773038.48900:  ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773038.48908: getting variables
  8303 1726773038.48909: in VariableManager get_vars()
  8303 1726773038.48921: Calling all_inventory to load vars for managed_node3
  8303 1726773038.48923: Calling groups_inventory to load vars for managed_node3
  8303 1726773038.48925: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773038.48930: Calling all_plugins_play to load vars for managed_node3
  8303 1726773038.48932: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773038.48934: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773038.48968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773038.49005: done with get_vars()
  8303 1726773038.49012: done getting variables
  8303 1726773038.49049: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2
Thursday 19 September 2024  15:10:38 -0400 (0:00:00.032)       0:00:15.069 **** 
  8303 1726773038.49083: entering _queue_task() for managed_node3/command
  8303 1726773038.49311: worker is 1 (out of 1 available)
  8303 1726773038.49327: exiting _queue_task() for managed_node3/command
  8303 1726773038.49343: done queuing things up, now waiting for results queue to drain
  8303 1726773038.49344: waiting for pending results...
  8895 1726773038.49988: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly
  8895 1726773038.50139: in run() - task 0affffe7-6841-6cfb-81ae-0000000000f5
  8895 1726773038.50157: variable 'ansible_search_path' from source: unknown
  8895 1726773038.50160: variable 'ansible_search_path' from source: unknown
  8895 1726773038.50198: calling self._execute()
  8895 1726773038.50350: variable 'ansible_host' from source: host vars for 'managed_node3'
  8895 1726773038.50357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8895 1726773038.50363: variable 'omit' from source: magic vars
  8895 1726773038.50437: variable 'omit' from source: magic vars
  8895 1726773038.50481: variable 'omit' from source: magic vars
  8895 1726773038.50506: variable 'omit' from source: magic vars
  8895 1726773038.50539: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8895 1726773038.50564: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8895 1726773038.50582: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8895 1726773038.50597: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8895 1726773038.50606: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8895 1726773038.50629: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8895 1726773038.50633: variable 'ansible_host' from source: host vars for 'managed_node3'
  8895 1726773038.50636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8895 1726773038.50704: Set connection var ansible_pipelining to False
  8895 1726773038.50712: Set connection var ansible_timeout to 10
  8895 1726773038.50717: Set connection var ansible_module_compression to ZIP_DEFLATED
  8895 1726773038.50721: Set connection var ansible_shell_executable to /bin/sh
  8895 1726773038.50723: Set connection var ansible_connection to ssh
  8895 1726773038.50729: Set connection var ansible_shell_type to sh
  8895 1726773038.50743: variable 'ansible_shell_executable' from source: unknown
  8895 1726773038.50746: variable 'ansible_connection' from source: unknown
  8895 1726773038.50748: variable 'ansible_module_compression' from source: unknown
  8895 1726773038.50750: variable 'ansible_shell_type' from source: unknown
  8895 1726773038.50752: variable 'ansible_shell_executable' from source: unknown
  8895 1726773038.50754: variable 'ansible_host' from source: host vars for 'managed_node3'
  8895 1726773038.50756: variable 'ansible_pipelining' from source: unknown
  8895 1726773038.50758: variable 'ansible_timeout' from source: unknown
  8895 1726773038.50760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8895 1726773038.50881: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  8895 1726773038.50896: variable 'omit' from source: magic vars
  8895 1726773038.50902: starting attempt loop
  8895 1726773038.50906: running the handler
  8895 1726773038.50918: _low_level_execute_command(): starting
  8895 1726773038.50926: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  8895 1726773038.53586: stdout chunk (state=2):
>>>/root
<<<
  8895 1726773038.53715: stderr chunk (state=3):
>>><<<
  8895 1726773038.53724: stdout chunk (state=3):
>>><<<
  8895 1726773038.53745: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  8895 1726773038.53760: _low_level_execute_command(): starting
  8895 1726773038.53766: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773038.537539-8895-149342174962304 `" && echo ansible-tmp-1726773038.537539-8895-149342174962304="` echo /root/.ansible/tmp/ansible-tmp-1726773038.537539-8895-149342174962304 `" ) && sleep 0'
  8895 1726773038.56499: stdout chunk (state=2):
>>>ansible-tmp-1726773038.537539-8895-149342174962304=/root/.ansible/tmp/ansible-tmp-1726773038.537539-8895-149342174962304
<<<
  8895 1726773038.56633: stderr chunk (state=3):
>>><<<
  8895 1726773038.56641: stdout chunk (state=3):
>>><<<
  8895 1726773038.56657: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773038.537539-8895-149342174962304=/root/.ansible/tmp/ansible-tmp-1726773038.537539-8895-149342174962304
, stderr=
  8895 1726773038.56684: variable 'ansible_module_compression' from source: unknown
  8895 1726773038.56738: ANSIBALLZ: Using generic lock for ansible.legacy.command
  8895 1726773038.56744: ANSIBALLZ: Acquiring lock
  8895 1726773038.56748: ANSIBALLZ: Lock acquired: 140242352720640
  8895 1726773038.56752: ANSIBALLZ: Creating module
  8895 1726773038.66563: ANSIBALLZ: Writing module into payload
  8895 1726773038.66646: ANSIBALLZ: Writing module
  8895 1726773038.66671: ANSIBALLZ: Renaming module
  8895 1726773038.66679: ANSIBALLZ: Done creating module
  8895 1726773038.66696: variable 'ansible_facts' from source: unknown
  8895 1726773038.66762: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773038.537539-8895-149342174962304/AnsiballZ_command.py
  8895 1726773038.66870: Sending initial data
  8895 1726773038.66878: Sent initial data (153 bytes)
  8895 1726773038.69575: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpd_x1ewly /root/.ansible/tmp/ansible-tmp-1726773038.537539-8895-149342174962304/AnsiballZ_command.py
<<<
  8895 1726773038.71426: stderr chunk (state=3):
>>><<<
  8895 1726773038.71438: stdout chunk (state=3):
>>><<<
  8895 1726773038.71465: done transferring module to remote
  8895 1726773038.71481: _low_level_execute_command(): starting
  8895 1726773038.71488: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773038.537539-8895-149342174962304/ /root/.ansible/tmp/ansible-tmp-1726773038.537539-8895-149342174962304/AnsiballZ_command.py && sleep 0'
  8895 1726773038.74292: stderr chunk (state=2):
>>><<<
  8895 1726773038.74303: stdout chunk (state=2):
>>><<<
  8895 1726773038.74320: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8895 1726773038.74325: _low_level_execute_command(): starting
  8895 1726773038.74331: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773038.537539-8895-149342174962304/AnsiballZ_command.py && sleep 0'
  8895 1726773039.02708: stdout chunk (state=2):
>>>
{"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 15:10:38.902017", "end": "2024-09-19 15:10:39.025344", "delta": "0:00:00.123327", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}
<<<
  8895 1726773039.03939: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8895 1726773039.03989: stderr chunk (state=3):
>>><<<
  8895 1726773039.03997: stdout chunk (state=3):
>>><<<
  8895 1726773039.04013: _low_level_execute_command() done: rc=0, stdout=
{"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 15:10:38.902017", "end": "2024-09-19 15:10:39.025344", "delta": "0:00:00.123327", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8895 1726773039.04054: done with _execute_module (ansible.legacy.command, {'_raw_params': 'tuned-adm verify -i', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773038.537539-8895-149342174962304/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  8895 1726773039.04064: _low_level_execute_command(): starting
  8895 1726773039.04074: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773038.537539-8895-149342174962304/ > /dev/null 2>&1 && sleep 0'
  8895 1726773039.06624: stderr chunk (state=2):
>>><<<
  8895 1726773039.06635: stdout chunk (state=2):
>>><<<
  8895 1726773039.06654: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8895 1726773039.06661: handler run complete
  8895 1726773039.06689: Evaluated conditional (False): False
  8895 1726773039.06702: attempt loop complete, returning result
  8895 1726773039.06707: _execute() done
  8895 1726773039.06711: dumping result to json
  8895 1726773039.06716: done dumping result, returning
  8895 1726773039.06725: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly [0affffe7-6841-6cfb-81ae-0000000000f5]
  8895 1726773039.06731: sending task result for task 0affffe7-6841-6cfb-81ae-0000000000f5
  8895 1726773039.06776: done sending task result for task 0affffe7-6841-6cfb-81ae-0000000000f5
  8895 1726773039.06780: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "cmd": [
        "tuned-adm",
        "verify",
        "-i"
    ],
    "delta": "0:00:00.123327",
    "end": "2024-09-19 15:10:39.025344",
    "rc": 0,
    "start": "2024-09-19 15:10:38.902017"
}

STDOUT:

Verification succeeded, current system settings match the preset profile.
See TuneD log file ('/var/log/tuned/tuned.log') for details.
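This step runs tuned-adm verify -i on the managed node and keeps the result for the conditionals that follow. The raw module output reported changed: true while the task result above shows changed: false, which points to a changed_when: false override; that override, and whether the value reaches __kernel_settings_register_verify_values via register or a later set_fact, are inferred rather than shown, so treat this as a sketch:

- name: Check that settings are applied correctly
  ansible.builtin.command: tuned-adm verify -i
  register: __kernel_settings_register_verify_values
  changed_when: false  # inferred from the displayed changed: false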
  8303 1726773039.07312: no more pending results, returning what we have
  8303 1726773039.07315: results queue empty
  8303 1726773039.07316: checking for any_errors_fatal
  8303 1726773039.07317: done checking for any_errors_fatal
  8303 1726773039.07318: checking for max_fail_percentage
  8303 1726773039.07321: done checking for max_fail_percentage
  8303 1726773039.07322: checking to see if all hosts have failed and the running result is not ok
  8303 1726773039.07323: done checking to see if all hosts have failed
  8303 1726773039.07324: getting the remaining hosts for this loop
  8303 1726773039.07325: done getting the remaining hosts for this loop
  8303 1726773039.07328: getting the next task for host managed_node3
  8303 1726773039.07332: done getting next task for host managed_node3
  8303 1726773039.07334:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log
  8303 1726773039.07336:  ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773039.07343: getting variables
  8303 1726773039.07344: in VariableManager get_vars()
  8303 1726773039.07364: Calling all_inventory to load vars for managed_node3
  8303 1726773039.07367: Calling groups_inventory to load vars for managed_node3
  8303 1726773039.07369: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773039.07375: Calling all_plugins_play to load vars for managed_node3
  8303 1726773039.07377: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773039.07378: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773039.07415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773039.07445: done with get_vars()
  8303 1726773039.07451: done getting variables
  8303 1726773039.07525: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Get last verify results from log] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:12
Thursday 19 September 2024  15:10:39 -0400 (0:00:00.584)       0:00:15.654 **** 
  8303 1726773039.07551: entering _queue_task() for managed_node3/shell
  8303 1726773039.07553: Creating lock for shell
  8303 1726773039.07731: worker is 1 (out of 1 available)
  8303 1726773039.07747: exiting _queue_task() for managed_node3/shell
  8303 1726773039.07759: done queuing things up, now waiting for results queue to drain
  8303 1726773039.07761: waiting for pending results...
  8942 1726773039.07875: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log
  8942 1726773039.07991: in run() - task 0affffe7-6841-6cfb-81ae-0000000000f6
  8942 1726773039.08008: variable 'ansible_search_path' from source: unknown
  8942 1726773039.08012: variable 'ansible_search_path' from source: unknown
  8942 1726773039.08040: calling self._execute()
  8942 1726773039.08093: variable 'ansible_host' from source: host vars for 'managed_node3'
  8942 1726773039.08103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8942 1726773039.08112: variable 'omit' from source: magic vars
  8942 1726773039.08432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
  8942 1726773039.08613: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
  8942 1726773039.08648: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
  8942 1726773039.08677: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
  8942 1726773039.08705: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
  8942 1726773039.08788: variable '__kernel_settings_register_verify_values' from source: set_fact
  8942 1726773039.08812: Evaluated conditional (__kernel_settings_register_verify_values is failed): False
  8942 1726773039.08817: when evaluation is False, skipping this task
  8942 1726773039.08820: _execute() done
  8942 1726773039.08824: dumping result to json
  8942 1726773039.08829: done dumping result, returning
  8942 1726773039.08835: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log [0affffe7-6841-6cfb-81ae-0000000000f6]
  8942 1726773039.08841: sending task result for task 0affffe7-6841-6cfb-81ae-0000000000f6
  8942 1726773039.08865: done sending task result for task 0affffe7-6841-6cfb-81ae-0000000000f6
  8942 1726773039.08868: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__kernel_settings_register_verify_values is failed",
    "skip_reason": "Conditional result was False"
}
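This task only fires when the preceding verification failed (when: __kernel_settings_register_verify_values is failed), presumably to extract the relevant lines from /var/log/tuned/tuned.log with a shell pipeline; here it is skipped because tuned-adm verify succeeded. A hedged sketch of the pattern, with the extraction command as a placeholder since the real pipeline is not shown in this log:

- name: Get last verify results from log
  ansible.builtin.shell: grep -i verify /var/log/tuned/tuned.log | tail -n 20  # placeholder pipeline
  when: __kernel_settings_register_verify_values is failed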
  8303 1726773039.08998: no more pending results, returning what we have
  8303 1726773039.09000: results queue empty
  8303 1726773039.09001: checking for any_errors_fatal
  8303 1726773039.09010: done checking for any_errors_fatal
  8303 1726773039.09011: checking for max_fail_percentage
  8303 1726773039.09012: done checking for max_fail_percentage
  8303 1726773039.09013: checking to see if all hosts have failed and the running result is not ok
  8303 1726773039.09013: done checking to see if all hosts have failed
  8303 1726773039.09014: getting the remaining hosts for this loop
  8303 1726773039.09015: done getting the remaining hosts for this loop
  8303 1726773039.09018: getting the next task for host managed_node3
  8303 1726773039.09023: done getting next task for host managed_node3
  8303 1726773039.09032:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors
  8303 1726773039.09036:  ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773039.09049: getting variables
  8303 1726773039.09050: in VariableManager get_vars()
  8303 1726773039.09081: Calling all_inventory to load vars for managed_node3
  8303 1726773039.09084: Calling groups_inventory to load vars for managed_node3
  8303 1726773039.09087: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773039.09098: Calling all_plugins_play to load vars for managed_node3
  8303 1726773039.09099: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773039.09101: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773039.09144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773039.09191: done with get_vars()
  8303 1726773039.09198: done getting variables
  8303 1726773039.09250: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:23
Thursday 19 September 2024  15:10:39 -0400 (0:00:00.017)       0:00:15.671 **** 
  8303 1726773039.09277: entering _queue_task() for managed_node3/fail
  8303 1726773039.09479: worker is 1 (out of 1 available)
  8303 1726773039.09498: exiting _queue_task() for managed_node3/fail
  8303 1726773039.09509: done queuing things up, now waiting for results queue to drain
  8303 1726773039.09510: waiting for pending results...
  8943 1726773039.09628: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors
  8943 1726773039.09747: in run() - task 0affffe7-6841-6cfb-81ae-0000000000f7
  8943 1726773039.09759: variable 'ansible_search_path' from source: unknown
  8943 1726773039.09762: variable 'ansible_search_path' from source: unknown
  8943 1726773039.09790: calling self._execute()
  8943 1726773039.09840: variable 'ansible_host' from source: host vars for 'managed_node3'
  8943 1726773039.09847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8943 1726773039.09853: variable 'omit' from source: magic vars
  8943 1726773039.10227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
  8943 1726773039.10448: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
  8943 1726773039.10682: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
  8943 1726773039.10780: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
  8943 1726773039.10817: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
  8943 1726773039.10925: variable '__kernel_settings_register_verify_values' from source: set_fact
  8943 1726773039.10952: Evaluated conditional (__kernel_settings_register_verify_values is failed): False
  8943 1726773039.10957: when evaluation is False, skipping this task
  8943 1726773039.10961: _execute() done
  8943 1726773039.10964: dumping result to json
  8943 1726773039.10967: done dumping result, returning
  8943 1726773039.10973: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors [0affffe7-6841-6cfb-81ae-0000000000f7]
  8943 1726773039.10980: sending task result for task 0affffe7-6841-6cfb-81ae-0000000000f7
  8943 1726773039.11011: done sending task result for task 0affffe7-6841-6cfb-81ae-0000000000f7
  8943 1726773039.11014: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__kernel_settings_register_verify_values is failed",
    "skip_reason": "Conditional result was False"
}
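
[editor's note] The two skipped results above come from tasks in verify_settings.yml that are guarded by the same condition, __kernel_settings_register_verify_values is failed; it evaluated to False in this run, so both tasks were skipped. A minimal sketch of how such a guarded pair is typically written follows. It is illustrative only: the task names, paths, and the conditional are taken from the log, while the shell command and the register name __kernel_settings_verify_log are hypothetical.

    # Illustrative sketch, not the role's actual source.
    - name: Get last verify results from log
      ansible.builtin.shell: journalctl -u tuned --no-pager | tail -n 20   # hypothetical command
      register: __kernel_settings_verify_log
      changed_when: false
      when: __kernel_settings_register_verify_values is failed

    - name: Report errors that are not bootloader errors
      ansible.builtin.fail:
        msg: "tuned verify reported errors: {{ __kernel_settings_verify_log.stdout | default('') }}"
      when: __kernel_settings_register_verify_values is failed
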
  8303 1726773039.11354: no more pending results, returning what we have
  8303 1726773039.11356: results queue empty
  8303 1726773039.11356: checking for any_errors_fatal
  8303 1726773039.11359: done checking for any_errors_fatal
  8303 1726773039.11360: checking for max_fail_percentage
  8303 1726773039.11361: done checking for max_fail_percentage
  8303 1726773039.11362: checking to see if all hosts have failed and the running result is not ok
  8303 1726773039.11362: done checking to see if all hosts have failed
  8303 1726773039.11362: getting the remaining hosts for this loop
  8303 1726773039.11363: done getting the remaining hosts for this loop
  8303 1726773039.11368: getting the next task for host managed_node3
  8303 1726773039.11373: done getting next task for host managed_node3
  8303 1726773039.11376:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes
  8303 1726773039.11378:  ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773039.11389: getting variables
  8303 1726773039.11390: in VariableManager get_vars()
  8303 1726773039.11414: Calling all_inventory to load vars for managed_node3
  8303 1726773039.11416: Calling groups_inventory to load vars for managed_node3
  8303 1726773039.11418: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773039.11432: Calling all_plugins_play to load vars for managed_node3
  8303 1726773039.11434: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773039.11436: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773039.11474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773039.11507: done with get_vars()
  8303 1726773039.11514: done getting variables
  8303 1726773039.11556: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:177
Thursday 19 September 2024  15:10:39 -0400 (0:00:00.023)       0:00:15.694 **** 
  8303 1726773039.11581: entering _queue_task() for managed_node3/set_fact
  8303 1726773039.11751: worker is 1 (out of 1 available)
  8303 1726773039.11768: exiting _queue_task() for managed_node3/set_fact
  8303 1726773039.11780: done queuing things up, now waiting for results queue to drain
  8303 1726773039.11780: waiting for pending results...
  8945 1726773039.11891: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes
  8945 1726773039.11991: in run() - task 0affffe7-6841-6cfb-81ae-000000000039
  8945 1726773039.12005: variable 'ansible_search_path' from source: unknown
  8945 1726773039.12009: variable 'ansible_search_path' from source: unknown
  8945 1726773039.12036: calling self._execute()
  8945 1726773039.12087: variable 'ansible_host' from source: host vars for 'managed_node3'
  8945 1726773039.12094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8945 1726773039.12100: variable 'omit' from source: magic vars
  8945 1726773039.12168: variable 'omit' from source: magic vars
  8945 1726773039.12201: variable 'omit' from source: magic vars
  8945 1726773039.12222: variable 'omit' from source: magic vars
  8945 1726773039.12252: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8945 1726773039.12278: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8945 1726773039.12300: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8945 1726773039.12318: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8945 1726773039.12329: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8945 1726773039.12352: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8945 1726773039.12356: variable 'ansible_host' from source: host vars for 'managed_node3'
  8945 1726773039.12358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8945 1726773039.12424: Set connection var ansible_pipelining to False
  8945 1726773039.12433: Set connection var ansible_timeout to 10
  8945 1726773039.12437: Set connection var ansible_module_compression to ZIP_DEFLATED
  8945 1726773039.12441: Set connection var ansible_shell_executable to /bin/sh
  8945 1726773039.12444: Set connection var ansible_connection to ssh
  8945 1726773039.12448: Set connection var ansible_shell_type to sh
  8945 1726773039.12461: variable 'ansible_shell_executable' from source: unknown
  8945 1726773039.12464: variable 'ansible_connection' from source: unknown
  8945 1726773039.12467: variable 'ansible_module_compression' from source: unknown
  8945 1726773039.12469: variable 'ansible_shell_type' from source: unknown
  8945 1726773039.12471: variable 'ansible_shell_executable' from source: unknown
  8945 1726773039.12472: variable 'ansible_host' from source: host vars for 'managed_node3'
  8945 1726773039.12477: variable 'ansible_pipelining' from source: unknown
  8945 1726773039.12479: variable 'ansible_timeout' from source: unknown
  8945 1726773039.12481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8945 1726773039.12581: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  8945 1726773039.12595: variable 'omit' from source: magic vars
  8945 1726773039.12601: starting attempt loop
  8945 1726773039.12605: running the handler
  8945 1726773039.12614: handler run complete
  8945 1726773039.12622: attempt loop complete, returning result
  8945 1726773039.12625: _execute() done
  8945 1726773039.12628: dumping result to json
  8945 1726773039.12632: done dumping result, returning
  8945 1726773039.12638: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes [0affffe7-6841-6cfb-81ae-000000000039]
  8945 1726773039.12644: sending task result for task 0affffe7-6841-6cfb-81ae-000000000039
  8945 1726773039.12666: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000039
  8945 1726773039.12669: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "kernel_settings_reboot_required": false
    },
    "changed": false
}
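
[editor's note] The ok result above records that no reboot is needed to apply the settings from this run. A task producing exactly this fact could look like the sketch below; the fact name and value come from the log, everything else is an assumption.

    - name: Set the flag that reboot is needed to apply changes
      ansible.builtin.set_fact:
        # False here because nothing in this run required a reboot
        # (assumption: the real role derives this from its apply results).
        kernel_settings_reboot_required: false
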
  8303 1726773039.12817: no more pending results, returning what we have
  8303 1726773039.12820: results queue empty
  8303 1726773039.12821: checking for any_errors_fatal
  8303 1726773039.12826: done checking for any_errors_fatal
  8303 1726773039.12827: checking for max_fail_percentage
  8303 1726773039.12828: done checking for max_fail_percentage
  8303 1726773039.12828: checking to see if all hosts have failed and the running result is not ok
  8303 1726773039.12829: done checking to see if all hosts have failed
  8303 1726773039.12830: getting the remaining hosts for this loop
  8303 1726773039.12831: done getting the remaining hosts for this loop
  8303 1726773039.12833: getting the next task for host managed_node3
  8303 1726773039.12838: done getting next task for host managed_node3
  8303 1726773039.12840:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing
  8303 1726773039.12842:  ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773039.12848: getting variables
  8303 1726773039.12848: in VariableManager get_vars()
  8303 1726773039.12875: Calling all_inventory to load vars for managed_node3
  8303 1726773039.12876: Calling groups_inventory to load vars for managed_node3
  8303 1726773039.12878: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773039.12884: Calling all_plugins_play to load vars for managed_node3
  8303 1726773039.12887: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773039.12889: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773039.12922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773039.12949: done with get_vars()
  8303 1726773039.12953: done getting variables
  8303 1726773039.12995: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:181
Thursday 19 September 2024  15:10:39 -0400 (0:00:00.014)       0:00:15.708 **** 
  8303 1726773039.13016: entering _queue_task() for managed_node3/set_fact
  8303 1726773039.13187: worker is 1 (out of 1 available)
  8303 1726773039.13200: exiting _queue_task() for managed_node3/set_fact
  8303 1726773039.13214: done queuing things up, now waiting for results queue to drain
  8303 1726773039.13215: waiting for pending results...
  8946 1726773039.13325: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing
  8946 1726773039.13424: in run() - task 0affffe7-6841-6cfb-81ae-00000000003a
  8946 1726773039.13437: variable 'ansible_search_path' from source: unknown
  8946 1726773039.13440: variable 'ansible_search_path' from source: unknown
  8946 1726773039.13464: calling self._execute()
  8946 1726773039.13526: variable 'ansible_host' from source: host vars for 'managed_node3'
  8946 1726773039.13534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8946 1726773039.13544: variable 'omit' from source: magic vars
  8946 1726773039.13615: variable 'omit' from source: magic vars
  8946 1726773039.13649: variable 'omit' from source: magic vars
  8946 1726773039.14035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
  8946 1726773039.14329: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
  8946 1726773039.14375: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
  8946 1726773039.14407: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
  8946 1726773039.14437: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
  8946 1726773039.14577: variable '__kernel_settings_register_profile' from source: set_fact
  8946 1726773039.14594: variable '__kernel_settings_register_mode' from source: set_fact
  8946 1726773039.14602: variable '__kernel_settings_register_apply' from source: set_fact
  8946 1726773039.14649: variable 'omit' from source: magic vars
  8946 1726773039.14679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8946 1726773039.14707: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8946 1726773039.14724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8946 1726773039.14741: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8946 1726773039.14752: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8946 1726773039.14782: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8946 1726773039.14789: variable 'ansible_host' from source: host vars for 'managed_node3'
  8946 1726773039.14793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8946 1726773039.14883: Set connection var ansible_pipelining to False
  8946 1726773039.14896: Set connection var ansible_timeout to 10
  8946 1726773039.14902: Set connection var ansible_module_compression to ZIP_DEFLATED
  8946 1726773039.14909: Set connection var ansible_shell_executable to /bin/sh
  8946 1726773039.14912: Set connection var ansible_connection to ssh
  8946 1726773039.14919: Set connection var ansible_shell_type to sh
  8946 1726773039.14949: variable 'ansible_shell_executable' from source: unknown
  8946 1726773039.14955: variable 'ansible_connection' from source: unknown
  8946 1726773039.14958: variable 'ansible_module_compression' from source: unknown
  8946 1726773039.14962: variable 'ansible_shell_type' from source: unknown
  8946 1726773039.14965: variable 'ansible_shell_executable' from source: unknown
  8946 1726773039.14970: variable 'ansible_host' from source: host vars for 'managed_node3'
  8946 1726773039.14974: variable 'ansible_pipelining' from source: unknown
  8946 1726773039.14976: variable 'ansible_timeout' from source: unknown
  8946 1726773039.14980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8946 1726773039.15092: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  8946 1726773039.15103: variable 'omit' from source: magic vars
  8946 1726773039.15109: starting attempt loop
  8946 1726773039.15113: running the handler
  8946 1726773039.15121: handler run complete
  8946 1726773039.15130: attempt loop complete, returning result
  8946 1726773039.15134: _execute() done
  8946 1726773039.15137: dumping result to json
  8946 1726773039.15141: done dumping result, returning
  8946 1726773039.15148: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing [0affffe7-6841-6cfb-81ae-00000000003a]
  8946 1726773039.15154: sending task result for task 0affffe7-6841-6cfb-81ae-00000000003a
  8946 1726773039.15175: done sending task result for task 0affffe7-6841-6cfb-81ae-00000000003a
  8946 1726773039.15178: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "__kernel_settings_changed": true
    },
    "changed": false
}
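
[editor's note] Similarly, __kernel_settings_changed is set to true for the test harness. Just before setting the fact the worker reads the registered results __kernel_settings_register_profile, __kernel_settings_register_mode, and __kernel_settings_register_apply, which suggests a template along these lines. The exact expression is not visible in the log, so treat this as a guess.

    - name: Set flag to indicate changed for testing
      ansible.builtin.set_fact:
        __kernel_settings_changed: "{{ __kernel_settings_register_profile is changed
          or __kernel_settings_register_mode is changed
          or __kernel_settings_register_apply is changed }}"
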
  8303 1726773039.15295: no more pending results, returning what we have
  8303 1726773039.15298: results queue empty
  8303 1726773039.15299: checking for any_errors_fatal
  8303 1726773039.15303: done checking for any_errors_fatal
  8303 1726773039.15304: checking for max_fail_percentage
  8303 1726773039.15305: done checking for max_fail_percentage
  8303 1726773039.15306: checking to see if all hosts have failed and the running result is not ok
  8303 1726773039.15306: done checking to see if all hosts have failed
  8303 1726773039.15307: getting the remaining hosts for this loop
  8303 1726773039.15308: done getting the remaining hosts for this loop
  8303 1726773039.15311: getting the next task for host managed_node3
  8303 1726773039.15318: done getting next task for host managed_node3
  8303 1726773039.15320:  ^ task is: TASK: meta (role_complete)
  8303 1726773039.15322:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773039.15330: getting variables
  8303 1726773039.15331: in VariableManager get_vars()
  8303 1726773039.15361: Calling all_inventory to load vars for managed_node3
  8303 1726773039.15363: Calling groups_inventory to load vars for managed_node3
  8303 1726773039.15365: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773039.15373: Calling all_plugins_play to load vars for managed_node3
  8303 1726773039.15375: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773039.15377: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773039.15424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773039.15452: done with get_vars()
  8303 1726773039.15457: done getting variables
  8303 1726773039.15512: done queuing things up, now waiting for results queue to drain
  8303 1726773039.15513: results queue empty
  8303 1726773039.15514: checking for any_errors_fatal
  8303 1726773039.15516: done checking for any_errors_fatal
  8303 1726773039.15516: checking for max_fail_percentage
  8303 1726773039.15517: done checking for max_fail_percentage
  8303 1726773039.15522: checking to see if all hosts have failed and the running result is not ok
  8303 1726773039.15522: done checking to see if all hosts have failed
  8303 1726773039.15522: getting the remaining hosts for this loop
  8303 1726773039.15523: done getting the remaining hosts for this loop
  8303 1726773039.15524: getting the next task for host managed_node3
  8303 1726773039.15527: done getting next task for host managed_node3
  8303 1726773039.15528:  ^ task is: TASK: Cleanup
  8303 1726773039.15529:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773039.15531: getting variables
  8303 1726773039.15531: in VariableManager get_vars()
  8303 1726773039.15540: Calling all_inventory to load vars for managed_node3
  8303 1726773039.15541: Calling groups_inventory to load vars for managed_node3
  8303 1726773039.15542: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773039.15545: Calling all_plugins_play to load vars for managed_node3
  8303 1726773039.15546: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773039.15547: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773039.15569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773039.15586: done with get_vars()
  8303 1726773039.15590: done getting variables

TASK [Cleanup] *****************************************************************
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_default.yml:14
Thursday 19 September 2024  15:10:39 -0400 (0:00:00.026)       0:00:15.735 **** 
  8303 1726773039.15672: entering _queue_task() for managed_node3/include_tasks
  8303 1726773039.15831: worker is 1 (out of 1 available)
  8303 1726773039.15847: exiting _queue_task() for managed_node3/include_tasks
  8303 1726773039.15858: done queuing things up, now waiting for results queue to drain
  8303 1726773039.15859: waiting for pending results...
  8949 1726773039.15973: running TaskExecutor() for managed_node3/TASK: Cleanup
  8949 1726773039.16064: in run() - task 0affffe7-6841-6cfb-81ae-000000000007
  8949 1726773039.16081: variable 'ansible_search_path' from source: unknown
  8949 1726773039.16114: calling self._execute()
  8949 1726773039.16171: variable 'ansible_host' from source: host vars for 'managed_node3'
  8949 1726773039.16182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8949 1726773039.16191: variable 'omit' from source: magic vars
  8949 1726773039.16261: _execute() done
  8949 1726773039.16268: dumping result to json
  8949 1726773039.16272: done dumping result, returning
  8949 1726773039.16275: done running TaskExecutor() for managed_node3/TASK: Cleanup [0affffe7-6841-6cfb-81ae-000000000007]
  8949 1726773039.16280: sending task result for task 0affffe7-6841-6cfb-81ae-000000000007
  8949 1726773039.16303: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000007
  8949 1726773039.16306: WORKER PROCESS EXITING
  8303 1726773039.16528: no more pending results, returning what we have
  8303 1726773039.16531: in VariableManager get_vars()
  8303 1726773039.16558: Calling all_inventory to load vars for managed_node3
  8303 1726773039.16559: Calling groups_inventory to load vars for managed_node3
  8303 1726773039.16561: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773039.16567: Calling all_plugins_play to load vars for managed_node3
  8303 1726773039.16569: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773039.16571: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773039.16606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773039.16626: done with get_vars()
  8303 1726773039.16630: variable 'ansible_search_path' from source: unknown
  8303 1726773039.16639: we have included files to process
  8303 1726773039.16640: generating all_blocks data
  8303 1726773039.16640: done generating all_blocks data
  8303 1726773039.16643: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml
  8303 1726773039.16644: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml
  8303 1726773039.16645: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml
included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml for managed_node3
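
[editor's note] The Cleanup task is an include that pulls tasks/cleanup.yml into the play; its blocks are then filtered on tags and appended to the host's task list. The host state shown above (always=1, run_state=3) suggests it sits in the always: section of the test play. A sketch of the declaration in tests_default.yml, with only the task name and the included path taken from the log:

    # Illustrative sketch; presumed to live in an always: section of the test play.
    - name: Cleanup
      ansible.builtin.include_tasks: tasks/cleanup.yml
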
  8303 1726773039.17381: done processing included file
  8303 1726773039.17387: iterating over new_blocks loaded from include file
  8303 1726773039.17388: in VariableManager get_vars()
  8303 1726773039.17399: done with get_vars()
  8303 1726773039.17400: filtering new block on tags
  8303 1726773039.17411: done filtering new block on tags
  8303 1726773039.17412: in VariableManager get_vars()
  8303 1726773039.17435: done with get_vars()
  8303 1726773039.17437: filtering new block on tags
  8303 1726773039.17462: done filtering new block on tags
  8303 1726773039.17464: done iterating over new_blocks loaded from include file
  8303 1726773039.17465: extending task lists for all hosts with included blocks
  8303 1726773039.18410: done extending task lists
  8303 1726773039.18411: done processing included files
  8303 1726773039.18412: results queue empty
  8303 1726773039.18412: checking for any_errors_fatal
  8303 1726773039.18413: done checking for any_errors_fatal
  8303 1726773039.18414: checking for max_fail_percentage
  8303 1726773039.18414: done checking for max_fail_percentage
  8303 1726773039.18414: checking to see if all hosts have failed and the running result is not ok
  8303 1726773039.18415: done checking to see if all hosts have failed
  8303 1726773039.18415: getting the remaining hosts for this loop
  8303 1726773039.18416: done getting the remaining hosts for this loop
  8303 1726773039.18418: getting the next task for host managed_node3
  8303 1726773039.18421: done getting next task for host managed_node3
  8303 1726773039.18422:  ^ task is: TASK: Show current tuned profile settings
  8303 1726773039.18423:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773039.18425: getting variables
  8303 1726773039.18425: in VariableManager get_vars()
  8303 1726773039.18433: Calling all_inventory to load vars for managed_node3
  8303 1726773039.18435: Calling groups_inventory to load vars for managed_node3
  8303 1726773039.18436: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773039.18440: Calling all_plugins_play to load vars for managed_node3
  8303 1726773039.18442: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773039.18443: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773039.18468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773039.18489: done with get_vars()
  8303 1726773039.18494: done getting variables
  8303 1726773039.18520: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Show current tuned profile settings] *************************************
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:2
Thursday 19 September 2024  15:10:39 -0400 (0:00:00.028)       0:00:15.764 **** 
  8303 1726773039.18540: entering _queue_task() for managed_node3/command
  8303 1726773039.18748: worker is 1 (out of 1 available)
  8303 1726773039.18762: exiting _queue_task() for managed_node3/command
  8303 1726773039.18775: done queuing things up, now waiting for results queue to drain
  8303 1726773039.18776: waiting for pending results...
  8950 1726773039.18983: running TaskExecutor() for managed_node3/TASK: Show current tuned profile settings
  8950 1726773039.19106: in run() - task 0affffe7-6841-6cfb-81ae-000000000151
  8950 1726773039.19124: variable 'ansible_search_path' from source: unknown
  8950 1726773039.19128: variable 'ansible_search_path' from source: unknown
  8950 1726773039.19164: calling self._execute()
  8950 1726773039.19231: variable 'ansible_host' from source: host vars for 'managed_node3'
  8950 1726773039.19241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8950 1726773039.19248: variable 'omit' from source: magic vars
  8950 1726773039.19346: variable 'omit' from source: magic vars
  8950 1726773039.19389: variable 'omit' from source: magic vars
  8950 1726773039.19667: variable '__kernel_settings_profile_filename' from source: role '' exported vars
  8950 1726773039.19724: variable '__kernel_settings_profile_dir' from source: role '' exported vars
  8950 1726773039.19815: variable '__kernel_settings_profile_parent' from source: set_fact
  8950 1726773039.19825: variable '__kernel_settings_tuned_profile' from source: role '' exported vars
  8950 1726773039.19859: variable 'omit' from source: magic vars
  8950 1726773039.19896: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8950 1726773039.19921: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8950 1726773039.19940: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8950 1726773039.19954: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8950 1726773039.19966: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8950 1726773039.19992: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8950 1726773039.19997: variable 'ansible_host' from source: host vars for 'managed_node3'
  8950 1726773039.20002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8950 1726773039.20070: Set connection var ansible_pipelining to False
  8950 1726773039.20081: Set connection var ansible_timeout to 10
  8950 1726773039.20089: Set connection var ansible_module_compression to ZIP_DEFLATED
  8950 1726773039.20095: Set connection var ansible_shell_executable to /bin/sh
  8950 1726773039.20098: Set connection var ansible_connection to ssh
  8950 1726773039.20104: Set connection var ansible_shell_type to sh
  8950 1726773039.20120: variable 'ansible_shell_executable' from source: unknown
  8950 1726773039.20123: variable 'ansible_connection' from source: unknown
  8950 1726773039.20126: variable 'ansible_module_compression' from source: unknown
  8950 1726773039.20130: variable 'ansible_shell_type' from source: unknown
  8950 1726773039.20133: variable 'ansible_shell_executable' from source: unknown
  8950 1726773039.20137: variable 'ansible_host' from source: host vars for 'managed_node3'
  8950 1726773039.20141: variable 'ansible_pipelining' from source: unknown
  8950 1726773039.20144: variable 'ansible_timeout' from source: unknown
  8950 1726773039.20148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8950 1726773039.20238: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  8950 1726773039.20249: variable 'omit' from source: magic vars
  8950 1726773039.20255: starting attempt loop
  8950 1726773039.20258: running the handler
  8950 1726773039.20270: _low_level_execute_command(): starting
  8950 1726773039.20278: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  8950 1726773039.22698: stdout chunk (state=2):
>>>/root
<<<
  8950 1726773039.22820: stderr chunk (state=3):
>>><<<
  8950 1726773039.22827: stdout chunk (state=3):
>>><<<
  8950 1726773039.22846: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  8950 1726773039.22860: _low_level_execute_command(): starting
  8950 1726773039.22867: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773039.2285433-8950-272981848761206 `" && echo ansible-tmp-1726773039.2285433-8950-272981848761206="` echo /root/.ansible/tmp/ansible-tmp-1726773039.2285433-8950-272981848761206 `" ) && sleep 0'
  8950 1726773039.25458: stdout chunk (state=2):
>>>ansible-tmp-1726773039.2285433-8950-272981848761206=/root/.ansible/tmp/ansible-tmp-1726773039.2285433-8950-272981848761206
<<<
  8950 1726773039.25598: stderr chunk (state=3):
>>><<<
  8950 1726773039.25606: stdout chunk (state=3):
>>><<<
  8950 1726773039.25623: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773039.2285433-8950-272981848761206=/root/.ansible/tmp/ansible-tmp-1726773039.2285433-8950-272981848761206
, stderr=
  8950 1726773039.25649: variable 'ansible_module_compression' from source: unknown
  8950 1726773039.25695: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED
  8950 1726773039.25726: variable 'ansible_facts' from source: unknown
  8950 1726773039.25805: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773039.2285433-8950-272981848761206/AnsiballZ_command.py
  8950 1726773039.25913: Sending initial data
  8950 1726773039.25920: Sent initial data (154 bytes)
  8950 1726773039.28615: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpx8lbgc2l /root/.ansible/tmp/ansible-tmp-1726773039.2285433-8950-272981848761206/AnsiballZ_command.py
<<<
  8950 1726773039.29833: stderr chunk (state=3):
>>><<<
  8950 1726773039.29847: stdout chunk (state=3):
>>><<<
  8950 1726773039.29868: done transferring module to remote
  8950 1726773039.29880: _low_level_execute_command(): starting
  8950 1726773039.29886: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773039.2285433-8950-272981848761206/ /root/.ansible/tmp/ansible-tmp-1726773039.2285433-8950-272981848761206/AnsiballZ_command.py && sleep 0'
  8950 1726773039.32407: stderr chunk (state=2):
>>><<<
  8950 1726773039.32419: stdout chunk (state=2):
>>><<<
  8950 1726773039.32435: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8950 1726773039.32440: _low_level_execute_command(): starting
  8950 1726773039.32445: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773039.2285433-8950-272981848761206/AnsiballZ_command.py && sleep 0'
  8950 1726773039.48120: stdout chunk (state=2):
>>>
{"changed": true, "stdout": "#\n# Ansible managed\n#\n# system_role:kernel_settings\n\n[main]\nsummary = kernel settings", "stderr": "", "rc": 0, "cmd": ["cat", "/etc/tuned/kernel_settings/tuned.conf"], "start": "2024-09-19 15:10:39.476703", "end": "2024-09-19 15:10:39.479617", "delta": "0:00:00.002914", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /etc/tuned/kernel_settings/tuned.conf", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}
<<<
  8950 1726773039.49531: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8950 1726773039.49547: stdout chunk (state=3):
>>><<<
  8950 1726773039.49559: stderr chunk (state=3):
>>><<<
  8950 1726773039.49576: _low_level_execute_command() done: rc=0, stdout=
{"changed": true, "stdout": "#\n# Ansible managed\n#\n# system_role:kernel_settings\n\n[main]\nsummary = kernel settings", "stderr": "", "rc": 0, "cmd": ["cat", "/etc/tuned/kernel_settings/tuned.conf"], "start": "2024-09-19 15:10:39.476703", "end": "2024-09-19 15:10:39.479617", "delta": "0:00:00.002914", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /etc/tuned/kernel_settings/tuned.conf", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8950 1726773039.49621: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773039.2285433-8950-272981848761206/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  8950 1726773039.49632: _low_level_execute_command(): starting
  8950 1726773039.49638: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773039.2285433-8950-272981848761206/ > /dev/null 2>&1 && sleep 0'
  8950 1726773039.52430: stderr chunk (state=2):
>>><<<
  8950 1726773039.52441: stdout chunk (state=2):
>>><<<
  8950 1726773039.52458: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8950 1726773039.52463: handler run complete
  8950 1726773039.52490: Evaluated conditional (False): False
  8950 1726773039.52500: attempt loop complete, returning result
  8950 1726773039.52503: _execute() done
  8950 1726773039.52506: dumping result to json
  8950 1726773039.52511: done dumping result, returning
  8950 1726773039.52518: done running TaskExecutor() for managed_node3/TASK: Show current tuned profile settings [0affffe7-6841-6cfb-81ae-000000000151]
  8950 1726773039.52523: sending task result for task 0affffe7-6841-6cfb-81ae-000000000151
  8950 1726773039.52559: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000151
  8950 1726773039.52562: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/tuned/kernel_settings/tuned.conf"
    ],
    "delta": "0:00:00.002914",
    "end": "2024-09-19 15:10:39.479617",
    "rc": 0,
    "start": "2024-09-19 15:10:39.476703"
}

STDOUT:

#
# Ansible managed
#
# system_role:kernel_settings

[main]
summary = kernel settings
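
[editor's note] The task above is a plain command that dumps the generated tuned profile; the log also records the full remote execution sequence for it (create a temp dir over SSH, sftp the AnsiballZ_command.py payload, chmod it, run it with /usr/libexec/platform-python, then remove the temp dir). A sketch of the task as it might appear in cleanup.yml, where only the command string and task name are taken from the log:

    - name: Show current tuned profile settings
      ansible.builtin.command: cat /etc/tuned/kernel_settings/tuned.conf
      # Assumption: the final result reports changed: false even though the raw
      # module output says changed: true, which points at changed_when: false.
      changed_when: false
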
  8303 1726773039.53367: no more pending results, returning what we have
  8303 1726773039.53371: results queue empty
  8303 1726773039.53372: checking for any_errors_fatal
  8303 1726773039.53373: done checking for any_errors_fatal
  8303 1726773039.53374: checking for max_fail_percentage
  8303 1726773039.53376: done checking for max_fail_percentage
  8303 1726773039.53376: checking to see if all hosts have failed and the running result is not ok
  8303 1726773039.53377: done checking to see if all hosts have failed
  8303 1726773039.53377: getting the remaining hosts for this loop
  8303 1726773039.53378: done getting the remaining hosts for this loop
  8303 1726773039.53382: getting the next task for host managed_node3
  8303 1726773039.53392: done getting next task for host managed_node3
  8303 1726773039.53395:  ^ task is: TASK: Run role with purge to remove everything
  8303 1726773039.53397:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773039.53400: getting variables
  8303 1726773039.53401: in VariableManager get_vars()
  8303 1726773039.53432: Calling all_inventory to load vars for managed_node3
  8303 1726773039.53435: Calling groups_inventory to load vars for managed_node3
  8303 1726773039.53437: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773039.53446: Calling all_plugins_play to load vars for managed_node3
  8303 1726773039.53449: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773039.53452: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773039.53502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773039.53544: done with get_vars()
  8303 1726773039.53552: done getting variables

TASK [Run role with purge to remove everything] ********************************
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:9
Thursday 19 September 2024  15:10:39 -0400 (0:00:00.350)       0:00:16.115 **** 
  8303 1726773039.53642: entering _queue_task() for managed_node3/include_role
  8303 1726773039.53849: worker is 1 (out of 1 available)
  8303 1726773039.53863: exiting _queue_task() for managed_node3/include_role
  8303 1726773039.53876: done queuing things up, now waiting for results queue to drain
  8303 1726773039.53877: waiting for pending results...
  8977 1726773039.54434: running TaskExecutor() for managed_node3/TASK: Run role with purge to remove everything
  8977 1726773039.54563: in run() - task 0affffe7-6841-6cfb-81ae-000000000153
  8977 1726773039.54580: variable 'ansible_search_path' from source: unknown
  8977 1726773039.54584: variable 'ansible_search_path' from source: unknown
  8977 1726773039.54619: calling self._execute()
  8977 1726773039.54743: variable 'ansible_host' from source: host vars for 'managed_node3'
  8977 1726773039.54753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8977 1726773039.54762: variable 'omit' from source: magic vars
  8977 1726773039.54855: _execute() done
  8977 1726773039.54861: dumping result to json
  8977 1726773039.54865: done dumping result, returning
  8977 1726773039.54870: done running TaskExecutor() for managed_node3/TASK: Run role with purge to remove everything [0affffe7-6841-6cfb-81ae-000000000153]
  8977 1726773039.54878: sending task result for task 0affffe7-6841-6cfb-81ae-000000000153
  8977 1726773039.54912: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000153
  8977 1726773039.54916: WORKER PROCESS EXITING
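
[editor's note] "Run role with purge to remove everything" re-includes the kernel_settings role, and the lines that follow show its vars, defaults, meta, tasks, and handlers being loaded again. A sketch of what such a cleanup invocation typically looks like; the purge variable name is an assumption based on the role's documented interface and is not visible in this log:

    - name: Run role with purge to remove everything
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.kernel_settings
      vars:
        kernel_settings_purge: true   # assumption: switch that removes all role-managed settings
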
  8303 1726773039.55223: no more pending results, returning what we have
  8303 1726773039.55227: in VariableManager get_vars()
  8303 1726773039.55305: Calling all_inventory to load vars for managed_node3
  8303 1726773039.55307: Calling groups_inventory to load vars for managed_node3
  8303 1726773039.55310: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773039.55318: Calling all_plugins_play to load vars for managed_node3
  8303 1726773039.55321: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773039.55324: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773039.55372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773039.55405: done with get_vars()
  8303 1726773039.55410: variable 'ansible_search_path' from source: unknown
  8303 1726773039.55411: variable 'ansible_search_path' from source: unknown
  8303 1726773039.55709: variable 'omit' from source: magic vars
  8303 1726773039.55742: variable 'omit' from source: magic vars
  8303 1726773039.55756: variable 'omit' from source: magic vars
  8303 1726773039.55760: we have included files to process
  8303 1726773039.55761: generating all_blocks data
  8303 1726773039.55762: done generating all_blocks data
  8303 1726773039.55765: processing included file: fedora.linux_system_roles.kernel_settings
  8303 1726773039.55789: in VariableManager get_vars()
  8303 1726773039.55804: done with get_vars()
  8303 1726773039.55831: in VariableManager get_vars()
  8303 1726773039.55848: done with get_vars()
  8303 1726773039.55887: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml
  8303 1726773039.55949: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml
  8303 1726773039.55974: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml
  8303 1726773039.56052: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml
  8303 1726773039.56583: in VariableManager get_vars()
  8303 1726773039.56606: done with get_vars()
  8303 1726773039.58328: in VariableManager get_vars()
  8303 1726773039.58354: done with get_vars()
  8303 1726773039.58515: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml
  8303 1726773039.59188: iterating over new_blocks loaded from include file
  8303 1726773039.59190: in VariableManager get_vars()
  8303 1726773039.59228: done with get_vars()
  8303 1726773039.59230: filtering new block on tags
  8303 1726773039.59249: done filtering new block on tags
  8303 1726773039.59251: in VariableManager get_vars()
  8303 1726773039.59267: done with get_vars()
  8303 1726773039.59269: filtering new block on tags
  8303 1726773039.59291: done filtering new block on tags
  8303 1726773039.59293: in VariableManager get_vars()
  8303 1726773039.59309: done with get_vars()
  8303 1726773039.59311: filtering new block on tags
  8303 1726773039.59354: done filtering new block on tags
  8303 1726773039.59357: in VariableManager get_vars()
  8303 1726773039.59373: done with get_vars()
  8303 1726773039.59375: filtering new block on tags
  8303 1726773039.59394: done filtering new block on tags
  8303 1726773039.59396: done iterating over new_blocks loaded from include file
  8303 1726773039.59397: extending task lists for all hosts with included blocks
  8303 1726773039.59687: done extending task lists
  8303 1726773039.59688: done processing included files
  8303 1726773039.59689: results queue empty
  8303 1726773039.59689: checking for any_errors_fatal
  8303 1726773039.59694: done checking for any_errors_fatal
  8303 1726773039.59695: checking for max_fail_percentage
  8303 1726773039.59696: done checking for max_fail_percentage
  8303 1726773039.59697: checking to see if all hosts have failed and the running result is not ok
  8303 1726773039.59697: done checking to see if all hosts have failed
  8303 1726773039.59698: getting the remaining hosts for this loop
  8303 1726773039.59699: done getting the remaining hosts for this loop
  8303 1726773039.59701: getting the next task for host managed_node3
  8303 1726773039.59705: done getting next task for host managed_node3
  8303 1726773039.59708:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values
  8303 1726773039.59711:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773039.59720: getting variables
  8303 1726773039.59721: in VariableManager get_vars()
  8303 1726773039.59733: Calling all_inventory to load vars for managed_node3
  8303 1726773039.59736: Calling groups_inventory to load vars for managed_node3
  8303 1726773039.59738: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773039.59744: Calling all_plugins_play to load vars for managed_node3
  8303 1726773039.59746: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773039.59749: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773039.59787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773039.59824: done with get_vars()
  8303 1726773039.59831: done getting variables
  8303 1726773039.59869: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2
Thursday 19 September 2024  15:10:39 -0400 (0:00:00.062)       0:00:16.177 **** 
  8303 1726773039.59908: entering _queue_task() for managed_node3/fail
  8303 1726773039.60159: worker is 1 (out of 1 available)
  8303 1726773039.60174: exiting _queue_task() for managed_node3/fail
  8303 1726773039.60189: done queuing things up, now waiting for results queue to drain
  8303 1726773039.60190: waiting for pending results...
  8979 1726773039.60410: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values
  8979 1726773039.60545: in run() - task 0affffe7-6841-6cfb-81ae-0000000001f5
  8979 1726773039.60563: variable 'ansible_search_path' from source: unknown
  8979 1726773039.60568: variable 'ansible_search_path' from source: unknown
  8979 1726773039.60603: calling self._execute()
  8979 1726773039.60668: variable 'ansible_host' from source: host vars for 'managed_node3'
  8979 1726773039.60678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8979 1726773039.60690: variable 'omit' from source: magic vars
  8979 1726773039.61119: variable 'kernel_settings_sysctl' from source: include params
  8979 1726773039.61131: variable '__kernel_settings_state_empty' from source: role '' all vars
  8979 1726773039.61145: Evaluated conditional (kernel_settings_sysctl != __kernel_settings_state_empty): True
  8979 1726773039.61500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
  8979 1726773039.63761: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
  8979 1726773039.63825: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
  8979 1726773039.63862: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
  8979 1726773039.63909: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
  8979 1726773039.63935: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
  8979 1726773039.64010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8979 1726773039.64039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8979 1726773039.64064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8979 1726773039.64106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8979 1726773039.64121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8979 1726773039.64173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8979 1726773039.64199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8979 1726773039.64224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8979 1726773039.64262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8979 1726773039.64279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8979 1726773039.64324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8979 1726773039.64347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8979 1726773039.64369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8979 1726773039.64408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8979 1726773039.64421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8979 1726773039.64787: variable 'kernel_settings_sysctl' from source: include params
  8979 1726773039.64815: Evaluated conditional ((kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", true) | list | length > 0) or (kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", false) | list | length > 0)): False
  8979 1726773039.64821: when evaluation is False, skipping this task
  8979 1726773039.64825: _execute() done
  8979 1726773039.64828: dumping result to json
  8979 1726773039.64832: done dumping result, returning
  8979 1726773039.64839: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [0affffe7-6841-6cfb-81ae-0000000001f5]
  8979 1726773039.64845: sending task result for task 0affffe7-6841-6cfb-81ae-0000000001f5
  8979 1726773039.64876: done sending task result for task 0affffe7-6841-6cfb-81ae-0000000001f5
  8979 1726773039.64880: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "(kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", true) | list | length > 0) or (kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", false) | list | length > 0)",
    "skip_reason": "Conditional result was False"
}
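
The skip above comes from the guard task at roles/kernel_settings/tasks/main.yml:2: a fail action whose when list combines the two conditionals evaluated in the log (kernel_settings_sysctl differs from the "empty" sentinel, and at least one entry carries a raw boolean value). A minimal sketch of that task, with the failure message text assumed since it does not appear in this log, looks like:

- name: Check sysctl settings for boolean values
  fail:
    msg: kernel_settings_sysctl must not contain raw boolean values  # message text assumed, not taken from this log
  when:
    - kernel_settings_sysctl != __kernel_settings_state_empty
    - >-
      (kernel_settings_sysctl | selectattr("value", "defined") |
      selectattr("value", "sameas", true) | list | length > 0) or
      (kernel_settings_sysctl | selectattr("value", "defined") |
      selectattr("value", "sameas", false) | list | length > 0)

The second condition evaluated False for this run, so the fail never fires and the task is reported as skipping.
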
  8303 1726773039.65532: no more pending results, returning what we have
  8303 1726773039.65537: results queue empty
  8303 1726773039.65537: checking for any_errors_fatal
  8303 1726773039.65539: done checking for any_errors_fatal
  8303 1726773039.65540: checking for max_fail_percentage
  8303 1726773039.65541: done checking for max_fail_percentage
  8303 1726773039.65542: checking to see if all hosts have failed and the running result is not ok
  8303 1726773039.65542: done checking to see if all hosts have failed
  8303 1726773039.65543: getting the remaining hosts for this loop
  8303 1726773039.65544: done getting the remaining hosts for this loop
  8303 1726773039.65547: getting the next task for host managed_node3
  8303 1726773039.65554: done getting next task for host managed_node3
  8303 1726773039.65558:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables
  8303 1726773039.65561:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773039.65581: getting variables
  8303 1726773039.65582: in VariableManager get_vars()
  8303 1726773039.65618: Calling all_inventory to load vars for managed_node3
  8303 1726773039.65621: Calling groups_inventory to load vars for managed_node3
  8303 1726773039.65623: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773039.65632: Calling all_plugins_play to load vars for managed_node3
  8303 1726773039.65634: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773039.65637: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773039.65691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773039.65740: done with get_vars()
  8303 1726773039.65748: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Set version specific variables] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:9
Thursday 19 September 2024  15:10:39 -0400 (0:00:00.059)       0:00:16.237 **** 
  8303 1726773039.65842: entering _queue_task() for managed_node3/include_tasks
  8303 1726773039.66057: worker is 1 (out of 1 available)
  8303 1726773039.66071: exiting _queue_task() for managed_node3/include_tasks
  8303 1726773039.66083: done queuing things up, now waiting for results queue to drain
  8303 1726773039.66084: waiting for pending results...
  8981 1726773039.66995: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables
  8981 1726773039.67139: in run() - task 0affffe7-6841-6cfb-81ae-0000000001f6
  8981 1726773039.67159: variable 'ansible_search_path' from source: unknown
  8981 1726773039.67164: variable 'ansible_search_path' from source: unknown
  8981 1726773039.67200: calling self._execute()
  8981 1726773039.67270: variable 'ansible_host' from source: host vars for 'managed_node3'
  8981 1726773039.67281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8981 1726773039.67292: variable 'omit' from source: magic vars
  8981 1726773039.67389: _execute() done
  8981 1726773039.67395: dumping result to json
  8981 1726773039.67399: done dumping result, returning
  8981 1726773039.67405: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables [0affffe7-6841-6cfb-81ae-0000000001f6]
  8981 1726773039.67412: sending task result for task 0affffe7-6841-6cfb-81ae-0000000001f6
  8981 1726773039.67440: done sending task result for task 0affffe7-6841-6cfb-81ae-0000000001f6
  8981 1726773039.67444: WORKER PROCESS EXITING
  8303 1726773039.67778: no more pending results, returning what we have
  8303 1726773039.67782: in VariableManager get_vars()
  8303 1726773039.67820: Calling all_inventory to load vars for managed_node3
  8303 1726773039.67823: Calling groups_inventory to load vars for managed_node3
  8303 1726773039.67825: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773039.67834: Calling all_plugins_play to load vars for managed_node3
  8303 1726773039.67837: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773039.67839: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773039.67892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773039.67934: done with get_vars()
  8303 1726773039.67941: variable 'ansible_search_path' from source: unknown
  8303 1726773039.67942: variable 'ansible_search_path' from source: unknown
  8303 1726773039.67975: we have included files to process
  8303 1726773039.67976: generating all_blocks data
  8303 1726773039.67978: done generating all_blocks data
  8303 1726773039.67983: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml
  8303 1726773039.67986: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml
  8303 1726773039.67989: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml
included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml for managed_node3
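
That "included:" line is the visible effect of the task at tasks/main.yml:9: an include_tasks pointing at set_vars.yml, which is why the strategy then reports included files to process and extends the host's task list with the new blocks. Assuming no parameters beyond what the log shows, the task reduces to:

- name: Set version specific variables
  include_tasks: set_vars.yml
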
  8303 1726773039.68709: done processing included file
  8303 1726773039.68712: iterating over new_blocks loaded from include file
  8303 1726773039.68713: in VariableManager get_vars()
  8303 1726773039.68740: done with get_vars()
  8303 1726773039.68742: filtering new block on tags
  8303 1726773039.68758: done filtering new block on tags
  8303 1726773039.68760: in VariableManager get_vars()
  8303 1726773039.68783: done with get_vars()
  8303 1726773039.68784: filtering new block on tags
  8303 1726773039.68816: done filtering new block on tags
  8303 1726773039.68818: in VariableManager get_vars()
  8303 1726773039.68840: done with get_vars()
  8303 1726773039.68842: filtering new block on tags
  8303 1726773039.68860: done filtering new block on tags
  8303 1726773039.68862: in VariableManager get_vars()
  8303 1726773039.68884: done with get_vars()
  8303 1726773039.68889: filtering new block on tags
  8303 1726773039.68905: done filtering new block on tags
  8303 1726773039.68907: done iterating over new_blocks loaded from include file
  8303 1726773039.68908: extending task lists for all hosts with included blocks
  8303 1726773039.69128: done extending task lists
  8303 1726773039.69129: done processing included files
  8303 1726773039.69130: results queue empty
  8303 1726773039.69131: checking for any_errors_fatal
  8303 1726773039.69134: done checking for any_errors_fatal
  8303 1726773039.69134: checking for max_fail_percentage
  8303 1726773039.69135: done checking for max_fail_percentage
  8303 1726773039.69136: checking to see if all hosts have failed and the running result is not ok
  8303 1726773039.69137: done checking to see if all hosts have failed
  8303 1726773039.69138: getting the remaining hosts for this loop
  8303 1726773039.69139: done getting the remaining hosts for this loop
  8303 1726773039.69141: getting the next task for host managed_node3
  8303 1726773039.69145: done getting next task for host managed_node3
  8303 1726773039.69148:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role
  8303 1726773039.69150:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773039.69160: getting variables
  8303 1726773039.69161: in VariableManager get_vars()
  8303 1726773039.69173: Calling all_inventory to load vars for managed_node3
  8303 1726773039.69175: Calling groups_inventory to load vars for managed_node3
  8303 1726773039.69178: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773039.69183: Calling all_plugins_play to load vars for managed_node3
  8303 1726773039.69187: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773039.69190: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773039.69224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773039.69264: done with get_vars()
  8303 1726773039.69271: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2
Thursday 19 September 2024  15:10:39 -0400 (0:00:00.034)       0:00:16.272 **** 
  8303 1726773039.69340: entering _queue_task() for managed_node3/setup
  8303 1726773039.69583: worker is 1 (out of 1 available)
  8303 1726773039.69598: exiting _queue_task() for managed_node3/setup
  8303 1726773039.69610: done queuing things up, now waiting for results queue to drain
  8303 1726773039.69613: waiting for pending results...
  8982 1726773039.69858: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role
  8982 1726773039.70015: in run() - task 0affffe7-6841-6cfb-81ae-000000000271
  8982 1726773039.70035: variable 'ansible_search_path' from source: unknown
  8982 1726773039.70039: variable 'ansible_search_path' from source: unknown
  8982 1726773039.70073: calling self._execute()
  8982 1726773039.70142: variable 'ansible_host' from source: host vars for 'managed_node3'
  8982 1726773039.70152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8982 1726773039.70162: variable 'omit' from source: magic vars
  8982 1726773039.70915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
  8982 1726773039.73117: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
  8982 1726773039.73395: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
  8982 1726773039.73435: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
  8982 1726773039.73468: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
  8982 1726773039.73494: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
  8982 1726773039.73566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8982 1726773039.73597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8982 1726773039.73622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8982 1726773039.73664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8982 1726773039.73679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8982 1726773039.73733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8982 1726773039.73757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8982 1726773039.73779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8982 1726773039.73819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8982 1726773039.73832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8982 1726773039.74003: variable '__kernel_settings_required_facts' from source: role '' all vars
  8982 1726773039.74015: variable 'ansible_facts' from source: unknown
  8982 1726773039.74050: Evaluated conditional (__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
  8982 1726773039.74056: when evaluation is False, skipping this task
  8982 1726773039.74059: _execute() done
  8982 1726773039.74062: dumping result to json
  8982 1726773039.74065: done dumping result, returning
  8982 1726773039.74072: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role [0affffe7-6841-6cfb-81ae-000000000271]
  8982 1726773039.74079: sending task result for task 0affffe7-6841-6cfb-81ae-000000000271
  8982 1726773039.74118: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000271
  8982 1726773039.74122: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0",
    "skip_reason": "Conditional result was False"
}
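
This skip is the fact-gathering guard at set_vars.yml:2: a setup task that only runs when a fact named in __kernel_settings_required_facts is missing from ansible_facts, and here the difference is empty because facts were gathered earlier in the run. A sketch under that reading, with any setup arguments (gather_subset, filter) omitted because they are not visible in this log:

- name: Ensure ansible_facts used by role
  setup:   # module arguments, if any, are not shown in this log
  when: __kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0
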
  8303 1726773039.74552: no more pending results, returning what we have
  8303 1726773039.74556: results queue empty
  8303 1726773039.74556: checking for any_errors_fatal
  8303 1726773039.74558: done checking for any_errors_fatal
  8303 1726773039.74559: checking for max_fail_percentage
  8303 1726773039.74560: done checking for max_fail_percentage
  8303 1726773039.74561: checking to see if all hosts have failed and the running result is not ok
  8303 1726773039.74562: done checking to see if all hosts have failed
  8303 1726773039.74562: getting the remaining hosts for this loop
  8303 1726773039.74563: done getting the remaining hosts for this loop
  8303 1726773039.74566: getting the next task for host managed_node3
  8303 1726773039.74575: done getting next task for host managed_node3
  8303 1726773039.74579:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree
  8303 1726773039.74583:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773039.74598: getting variables
  8303 1726773039.74599: in VariableManager get_vars()
  8303 1726773039.74633: Calling all_inventory to load vars for managed_node3
  8303 1726773039.74636: Calling groups_inventory to load vars for managed_node3
  8303 1726773039.74638: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773039.74646: Calling all_plugins_play to load vars for managed_node3
  8303 1726773039.74649: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773039.74652: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773039.74703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773039.74754: done with get_vars()
  8303 1726773039.74762: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Check if system is ostree] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10
Thursday 19 September 2024  15:10:39 -0400 (0:00:00.055)       0:00:16.327 **** 
  8303 1726773039.74861: entering _queue_task() for managed_node3/stat
  8303 1726773039.75058: worker is 1 (out of 1 available)
  8303 1726773039.75072: exiting _queue_task() for managed_node3/stat
  8303 1726773039.75083: done queuing things up, now waiting for results queue to drain
  8303 1726773039.75087: waiting for pending results...
  8984 1726773039.75298: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree
  8984 1726773039.75447: in run() - task 0affffe7-6841-6cfb-81ae-000000000273
  8984 1726773039.75465: variable 'ansible_search_path' from source: unknown
  8984 1726773039.75469: variable 'ansible_search_path' from source: unknown
  8984 1726773039.75504: calling self._execute()
  8984 1726773039.75647: variable 'ansible_host' from source: host vars for 'managed_node3'
  8984 1726773039.75658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8984 1726773039.75667: variable 'omit' from source: magic vars
  8984 1726773039.76081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
  8984 1726773039.76347: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
  8984 1726773039.76390: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
  8984 1726773039.76420: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
  8984 1726773039.76452: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
  8984 1726773039.76527: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
  8984 1726773039.76550: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
  8984 1726773039.76573: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
  8984 1726773039.76603: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
  8984 1726773039.76718: variable '__kernel_settings_is_ostree' from source: set_fact
  8984 1726773039.76731: Evaluated conditional (not __kernel_settings_is_ostree is defined): False
  8984 1726773039.76735: when evaluation is False, skipping this task
  8984 1726773039.76740: _execute() done
  8984 1726773039.76743: dumping result to json
  8984 1726773039.76746: done dumping result, returning
  8984 1726773039.76753: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree [0affffe7-6841-6cfb-81ae-000000000273]
  8984 1726773039.76759: sending task result for task 0affffe7-6841-6cfb-81ae-000000000273
  8984 1726773039.76791: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000273
  8984 1726773039.76795: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not __kernel_settings_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
  8303 1726773039.77226: no more pending results, returning what we have
  8303 1726773039.77229: results queue empty
  8303 1726773039.77230: checking for any_errors_fatal
  8303 1726773039.77234: done checking for any_errors_fatal
  8303 1726773039.77234: checking for max_fail_percentage
  8303 1726773039.77236: done checking for max_fail_percentage
  8303 1726773039.77236: checking to see if all hosts have failed and the running result is not ok
  8303 1726773039.77237: done checking to see if all hosts have failed
  8303 1726773039.77238: getting the remaining hosts for this loop
  8303 1726773039.77239: done getting the remaining hosts for this loop
  8303 1726773039.77242: getting the next task for host managed_node3
  8303 1726773039.77248: done getting next task for host managed_node3
  8303 1726773039.77251:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree
  8303 1726773039.77255:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773039.77267: getting variables
  8303 1726773039.77268: in VariableManager get_vars()
  8303 1726773039.77296: Calling all_inventory to load vars for managed_node3
  8303 1726773039.77299: Calling groups_inventory to load vars for managed_node3
  8303 1726773039.77301: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773039.77309: Calling all_plugins_play to load vars for managed_node3
  8303 1726773039.77311: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773039.77314: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773039.77362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773039.77409: done with get_vars()
  8303 1726773039.77417: done getting variables
  8303 1726773039.77469: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:15
Thursday 19 September 2024  15:10:39 -0400 (0:00:00.026)       0:00:16.353 **** 
  8303 1726773039.77507: entering _queue_task() for managed_node3/set_fact
  8303 1726773039.77718: worker is 1 (out of 1 available)
  8303 1726773039.77731: exiting _queue_task() for managed_node3/set_fact
  8303 1726773039.77743: done queuing things up, now waiting for results queue to drain
  8303 1726773039.77745: waiting for pending results...
  8986 1726773039.77965: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree
  8986 1726773039.78126: in run() - task 0affffe7-6841-6cfb-81ae-000000000274
  8986 1726773039.78144: variable 'ansible_search_path' from source: unknown
  8986 1726773039.78148: variable 'ansible_search_path' from source: unknown
  8986 1726773039.78180: calling self._execute()
  8986 1726773039.78248: variable 'ansible_host' from source: host vars for 'managed_node3'
  8986 1726773039.78257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8986 1726773039.78265: variable 'omit' from source: magic vars
  8986 1726773039.78713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
  8986 1726773039.78994: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
  8986 1726773039.79037: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
  8986 1726773039.79071: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
  8986 1726773039.79107: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
  8986 1726773039.79182: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
  8986 1726773039.79209: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
  8986 1726773039.79233: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
  8986 1726773039.79258: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
  8986 1726773039.79369: variable '__kernel_settings_is_ostree' from source: set_fact
  8986 1726773039.79382: Evaluated conditional (not __kernel_settings_is_ostree is defined): False
  8986 1726773039.79388: when evaluation is False, skipping this task
  8986 1726773039.79391: _execute() done
  8986 1726773039.79394: dumping result to json
  8986 1726773039.79397: done dumping result, returning
  8986 1726773039.79403: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree [0affffe7-6841-6cfb-81ae-000000000274]
  8986 1726773039.79409: sending task result for task 0affffe7-6841-6cfb-81ae-000000000274
  8986 1726773039.79437: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000274
  8986 1726773039.79440: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not __kernel_settings_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
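
The two ostree tasks at set_vars.yml:10 and set_vars.yml:15 (a stat followed by a set_fact) are both guarded by "not __kernel_settings_is_ostree is defined", and the log shows the variable already comes from an earlier set_fact, so both skip. The path being stat-ed and the register name are not visible here, so the sketch below assumes both:

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted   # assumed path, not shown in this log
  register: __ostree_booted_stat   # assumed register name
  when: not __kernel_settings_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __kernel_settings_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __kernel_settings_is_ostree is defined
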
  8303 1726773039.79790: no more pending results, returning what we have
  8303 1726773039.79793: results queue empty
  8303 1726773039.79794: checking for any_errors_fatal
  8303 1726773039.79798: done checking for any_errors_fatal
  8303 1726773039.79798: checking for max_fail_percentage
  8303 1726773039.79800: done checking for max_fail_percentage
  8303 1726773039.79801: checking to see if all hosts have failed and the running result is not ok
  8303 1726773039.79801: done checking to see if all hosts have failed
  8303 1726773039.79802: getting the remaining hosts for this loop
  8303 1726773039.79804: done getting the remaining hosts for this loop
  8303 1726773039.79807: getting the next task for host managed_node3
  8303 1726773039.79815: done getting next task for host managed_node3
  8303 1726773039.79819:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin
  8303 1726773039.79823:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773039.79837: getting variables
  8303 1726773039.79838: in VariableManager get_vars()
  8303 1726773039.79870: Calling all_inventory to load vars for managed_node3
  8303 1726773039.79873: Calling groups_inventory to load vars for managed_node3
  8303 1726773039.79875: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773039.79884: Calling all_plugins_play to load vars for managed_node3
  8303 1726773039.79888: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773039.79891: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773039.79940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773039.79993: done with get_vars()
  8303 1726773039.80001: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:22
Thursday 19 September 2024  15:10:39 -0400 (0:00:00.025)       0:00:16.379 **** 
  8303 1726773039.80097: entering _queue_task() for managed_node3/stat
  8303 1726773039.80298: worker is 1 (out of 1 available)
  8303 1726773039.80312: exiting _queue_task() for managed_node3/stat
  8303 1726773039.80323: done queuing things up, now waiting for results queue to drain
  8303 1726773039.80325: waiting for pending results...
  8987 1726773039.80542: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin
  8987 1726773039.80700: in run() - task 0affffe7-6841-6cfb-81ae-000000000276
  8987 1726773039.80718: variable 'ansible_search_path' from source: unknown
  8987 1726773039.80722: variable 'ansible_search_path' from source: unknown
  8987 1726773039.80755: calling self._execute()
  8987 1726773039.80826: variable 'ansible_host' from source: host vars for 'managed_node3'
  8987 1726773039.80835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8987 1726773039.80843: variable 'omit' from source: magic vars
  8987 1726773039.81278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
  8987 1726773039.81559: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
  8987 1726773039.81602: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
  8987 1726773039.81635: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
  8987 1726773039.81713: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
  8987 1726773039.81791: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
  8987 1726773039.81816: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
  8987 1726773039.81841: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
  8987 1726773039.81865: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
  8987 1726773039.81978: variable '__kernel_settings_is_transactional' from source: set_fact
  8987 1726773039.81992: Evaluated conditional (not __kernel_settings_is_transactional is defined): False
  8987 1726773039.81997: when evaluation is False, skipping this task
  8987 1726773039.82000: _execute() done
  8987 1726773039.82003: dumping result to json
  8987 1726773039.82006: done dumping result, returning
  8987 1726773039.82012: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin [0affffe7-6841-6cfb-81ae-000000000276]
  8987 1726773039.82018: sending task result for task 0affffe7-6841-6cfb-81ae-000000000276
  8987 1726773039.82046: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000276
  8987 1726773039.82049: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not __kernel_settings_is_transactional is defined",
    "skip_reason": "Conditional result was False"
}
  8303 1726773039.82404: no more pending results, returning what we have
  8303 1726773039.82407: results queue empty
  8303 1726773039.82408: checking for any_errors_fatal
  8303 1726773039.82414: done checking for any_errors_fatal
  8303 1726773039.82415: checking for max_fail_percentage
  8303 1726773039.82416: done checking for max_fail_percentage
  8303 1726773039.82417: checking to see if all hosts have failed and the running result is not ok
  8303 1726773039.82417: done checking to see if all hosts have failed
  8303 1726773039.82418: getting the remaining hosts for this loop
  8303 1726773039.82419: done getting the remaining hosts for this loop
  8303 1726773039.82423: getting the next task for host managed_node3
  8303 1726773039.82429: done getting next task for host managed_node3
  8303 1726773039.82432:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists
  8303 1726773039.82436:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773039.82450: getting variables
  8303 1726773039.82451: in VariableManager get_vars()
  8303 1726773039.82483: Calling all_inventory to load vars for managed_node3
  8303 1726773039.82487: Calling groups_inventory to load vars for managed_node3
  8303 1726773039.82489: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773039.82497: Calling all_plugins_play to load vars for managed_node3
  8303 1726773039.82500: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773039.82502: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773039.82552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773039.82603: done with get_vars()
  8303 1726773039.82612: done getting variables
  8303 1726773039.82665: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:27
Thursday 19 September 2024  15:10:39 -0400 (0:00:00.026)       0:00:16.405 **** 
  8303 1726773039.82700: entering _queue_task() for managed_node3/set_fact
  8303 1726773039.82896: worker is 1 (out of 1 available)
  8303 1726773039.82910: exiting _queue_task() for managed_node3/set_fact
  8303 1726773039.82923: done queuing things up, now waiting for results queue to drain
  8303 1726773039.82924: waiting for pending results...
  8988 1726773039.83131: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists
  8988 1726773039.83277: in run() - task 0affffe7-6841-6cfb-81ae-000000000277
  8988 1726773039.83295: variable 'ansible_search_path' from source: unknown
  8988 1726773039.83299: variable 'ansible_search_path' from source: unknown
  8988 1726773039.83330: calling self._execute()
  8988 1726773039.83399: variable 'ansible_host' from source: host vars for 'managed_node3'
  8988 1726773039.83409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8988 1726773039.83418: variable 'omit' from source: magic vars
  8988 1726773039.83857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
  8988 1726773039.84176: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
  8988 1726773039.84220: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
  8988 1726773039.84250: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
  8988 1726773039.84281: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
  8988 1726773039.84358: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
  8988 1726773039.84382: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
  8988 1726773039.84409: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
  8988 1726773039.84433: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
  8988 1726773039.84541: variable '__kernel_settings_is_transactional' from source: set_fact
  8988 1726773039.84553: Evaluated conditional (not __kernel_settings_is_transactional is defined): False
  8988 1726773039.84558: when evaluation is False, skipping this task
  8988 1726773039.84561: _execute() done
  8988 1726773039.84564: dumping result to json
  8988 1726773039.84567: done dumping result, returning
  8988 1726773039.84572: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists [0affffe7-6841-6cfb-81ae-000000000277]
  8988 1726773039.84579: sending task result for task 0affffe7-6841-6cfb-81ae-000000000277
  8988 1726773039.84608: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000277
  8988 1726773039.84611: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not __kernel_settings_is_transactional is defined",
    "skip_reason": "Conditional result was False"
}
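
The same pattern repeats for transactional-update at set_vars.yml:22 and set_vars.yml:27: a stat of /sbin/transactional-update (the path is implied by the task name) and a set_fact, both guarded by "not __kernel_settings_is_transactional is defined" and both skipped because the flag was set earlier in the run. A sketch with an assumed register name:

- name: Check if transactional-update exists in /sbin
  stat:
    path: /sbin/transactional-update
  register: __transactional_update_stat   # assumed register name
  when: not __kernel_settings_is_transactional is defined

- name: Set flag if transactional-update exists
  set_fact:
    __kernel_settings_is_transactional: "{{ __transactional_update_stat.stat.exists }}"
  when: not __kernel_settings_is_transactional is defined
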
  8303 1726773039.84942: no more pending results, returning what we have
  8303 1726773039.84945: results queue empty
  8303 1726773039.84946: checking for any_errors_fatal
  8303 1726773039.84951: done checking for any_errors_fatal
  8303 1726773039.84951: checking for max_fail_percentage
  8303 1726773039.84953: done checking for max_fail_percentage
  8303 1726773039.84953: checking to see if all hosts have failed and the running result is not ok
  8303 1726773039.84954: done checking to see if all hosts have failed
  8303 1726773039.84955: getting the remaining hosts for this loop
  8303 1726773039.84956: done getting the remaining hosts for this loop
  8303 1726773039.84959: getting the next task for host managed_node3
  8303 1726773039.84967: done getting next task for host managed_node3
  8303 1726773039.84971:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables
  8303 1726773039.84975:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773039.84991: getting variables
  8303 1726773039.84993: in VariableManager get_vars()
  8303 1726773039.85023: Calling all_inventory to load vars for managed_node3
  8303 1726773039.85025: Calling groups_inventory to load vars for managed_node3
  8303 1726773039.85027: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773039.85035: Calling all_plugins_play to load vars for managed_node3
  8303 1726773039.85038: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773039.85040: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773039.85091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773039.85143: done with get_vars()
  8303 1726773039.85150: done getting variables
  8303 1726773039.85204: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set platform/version specific variables] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:31
Thursday 19 September 2024  15:10:39 -0400 (0:00:00.025)       0:00:16.430 **** 
  8303 1726773039.85238: entering _queue_task() for managed_node3/include_vars
  8303 1726773039.85432: worker is 1 (out of 1 available)
  8303 1726773039.85445: exiting _queue_task() for managed_node3/include_vars
  8303 1726773039.85458: done queuing things up, now waiting for results queue to drain
  8303 1726773039.85458: waiting for pending results...
  8989 1726773039.85664: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables
  8989 1726773039.85808: in run() - task 0affffe7-6841-6cfb-81ae-000000000279
  8989 1726773039.85826: variable 'ansible_search_path' from source: unknown
  8989 1726773039.85831: variable 'ansible_search_path' from source: unknown
  8989 1726773039.85862: calling self._execute()
  8989 1726773039.85934: variable 'ansible_host' from source: host vars for 'managed_node3'
  8989 1726773039.85945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8989 1726773039.85953: variable 'omit' from source: magic vars
  8989 1726773039.86044: variable 'omit' from source: magic vars
  8989 1726773039.86116: variable 'omit' from source: magic vars
  8989 1726773039.86567: variable 'ffparams' from source: task vars
  8989 1726773039.86699: variable 'ansible_facts' from source: unknown
  8989 1726773039.86832: variable 'ansible_facts' from source: unknown
  8989 1726773039.86916: variable 'ansible_facts' from source: unknown
  8989 1726773039.87002: variable 'ansible_facts' from source: unknown
  8989 1726773039.87070: variable 'role_path' from source: magic vars
  8989 1726773039.87224: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup
  8989 1726773039.87414: Loaded config def from plugin (lookup/first_found)
  8989 1726773039.87423: Loading LookupModule 'first_found' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/first_found.py
  8989 1726773039.87456: variable 'ansible_search_path' from source: unknown
  8989 1726773039.87476: variable 'ansible_search_path' from source: unknown
  8989 1726773039.87484: variable 'ansible_search_path' from source: unknown
  8989 1726773039.87493: variable 'ansible_search_path' from source: unknown
  8989 1726773039.87500: variable 'ansible_search_path' from source: unknown
  8989 1726773039.87517: variable 'omit' from source: magic vars
  8989 1726773039.87540: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8989 1726773039.87561: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8989 1726773039.87577: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8989 1726773039.87635: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8989 1726773039.87646: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8989 1726773039.87670: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8989 1726773039.87675: variable 'ansible_host' from source: host vars for 'managed_node3'
  8989 1726773039.87679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8989 1726773039.87771: Set connection var ansible_pipelining to False
  8989 1726773039.87782: Set connection var ansible_timeout to 10
  8989 1726773039.87790: Set connection var ansible_module_compression to ZIP_DEFLATED
  8989 1726773039.87795: Set connection var ansible_shell_executable to /bin/sh
  8989 1726773039.87798: Set connection var ansible_connection to ssh
  8989 1726773039.87805: Set connection var ansible_shell_type to sh
  8989 1726773039.87825: variable 'ansible_shell_executable' from source: unknown
  8989 1726773039.87829: variable 'ansible_connection' from source: unknown
  8989 1726773039.87832: variable 'ansible_module_compression' from source: unknown
  8989 1726773039.87835: variable 'ansible_shell_type' from source: unknown
  8989 1726773039.87837: variable 'ansible_shell_executable' from source: unknown
  8989 1726773039.87840: variable 'ansible_host' from source: host vars for 'managed_node3'
  8989 1726773039.87844: variable 'ansible_pipelining' from source: unknown
  8989 1726773039.87847: variable 'ansible_timeout' from source: unknown
  8989 1726773039.87850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8989 1726773039.87939: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  8989 1726773039.87950: variable 'omit' from source: magic vars
  8989 1726773039.87956: starting attempt loop
  8989 1726773039.87959: running the handler
  8989 1726773039.88009: handler run complete
  8989 1726773039.88019: attempt loop complete, returning result
  8989 1726773039.88023: _execute() done
  8989 1726773039.88026: dumping result to json
  8989 1726773039.88030: done dumping result, returning
  8989 1726773039.88036: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables [0affffe7-6841-6cfb-81ae-000000000279]
  8989 1726773039.88042: sending task result for task 0affffe7-6841-6cfb-81ae-000000000279
  8989 1726773039.88072: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000279
  8989 1726773039.88075: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "__kernel_settings_packages": [
            "tuned",
            "python3-configobj"
        ],
        "__kernel_settings_services": [
            "tuned"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/default.yml"
    ],
    "changed": false
}
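
The result above is the role's "Set platform/version specific variables" step: an include_vars whose file is chosen by the first_found lookup loaded just above (note the 'ffparams' task var and the repeated ansible_facts references while the candidate list is templated). On this host only vars/default.yml matched, which supplies __kernel_settings_packages and __kernel_settings_services. A minimal sketch of that pattern follows; the candidate file names are an assumption based on the usual linux-system-roles layout, and only default.yml is confirmed by ansible_included_var_files above:

    - name: Set platform/version specific variables
      ansible.builtin.include_vars: "{{ lookup('first_found', ffparams) }}"
      vars:
        ffparams:
          files:
            # Hypothetical candidates; only default.yml is confirmed by this log.
            - "{{ ansible_facts['distribution'] }}-{{ ansible_facts['distribution_version'] }}.yml"
            - "{{ ansible_facts['distribution'] }}-{{ ansible_facts['distribution_major_version'] }}.yml"
            - "{{ ansible_facts['distribution'] }}.yml"
            - default.yml
          paths:
            - "{{ role_path }}/vars"
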
  8303 1726773039.88477: no more pending results, returning what we have
  8303 1726773039.88480: results queue empty
  8303 1726773039.88481: checking for any_errors_fatal
  8303 1726773039.88484: done checking for any_errors_fatal
  8303 1726773039.88487: checking for max_fail_percentage
  8303 1726773039.88489: done checking for max_fail_percentage
  8303 1726773039.88489: checking to see if all hosts have failed and the running result is not ok
  8303 1726773039.88490: done checking to see if all hosts have failed
  8303 1726773039.88490: getting the remaining hosts for this loop
  8303 1726773039.88491: done getting the remaining hosts for this loop
  8303 1726773039.88494: getting the next task for host managed_node3
  8303 1726773039.88502: done getting next task for host managed_node3
  8303 1726773039.88505:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed
  8303 1726773039.88509:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773039.88519: getting variables
  8303 1726773039.88520: in VariableManager get_vars()
  8303 1726773039.88549: Calling all_inventory to load vars for managed_node3
  8303 1726773039.88552: Calling groups_inventory to load vars for managed_node3
  8303 1726773039.88554: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773039.88567: Calling all_plugins_play to load vars for managed_node3
  8303 1726773039.88570: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773039.88573: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773039.88621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773039.88665: done with get_vars()
  8303 1726773039.88672: done getting variables
  8303 1726773039.88724: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Ensure required packages are installed] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12
Thursday 19 September 2024  15:10:39 -0400 (0:00:00.035)       0:00:16.466 **** 
  8303 1726773039.88755: entering _queue_task() for managed_node3/package
  8303 1726773039.88943: worker is 1 (out of 1 available)
  8303 1726773039.88957: exiting _queue_task() for managed_node3/package
  8303 1726773039.88969: done queuing things up, now waiting for results queue to drain
  8303 1726773039.88969: waiting for pending results...
  8990 1726773039.89177: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed
  8990 1726773039.89309: in run() - task 0affffe7-6841-6cfb-81ae-0000000001f7
  8990 1726773039.89326: variable 'ansible_search_path' from source: unknown
  8990 1726773039.89330: variable 'ansible_search_path' from source: unknown
  8990 1726773039.89360: calling self._execute()
  8990 1726773039.89504: variable 'ansible_host' from source: host vars for 'managed_node3'
  8990 1726773039.89513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8990 1726773039.89521: variable 'omit' from source: magic vars
  8990 1726773039.89612: variable 'omit' from source: magic vars
  8990 1726773039.89663: variable 'omit' from source: magic vars
  8990 1726773039.89690: variable '__kernel_settings_packages' from source: include_vars
  8990 1726773039.89949: variable '__kernel_settings_packages' from source: include_vars
  8990 1726773039.90200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
  8990 1726773039.92551: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
  8990 1726773039.92618: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
  8990 1726773039.92656: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
  8990 1726773039.92704: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
  8990 1726773039.92729: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
  8990 1726773039.92820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  8990 1726773039.92847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  8990 1726773039.92870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  8990 1726773039.92911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  8990 1726773039.92925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  8990 1726773039.93023: variable '__kernel_settings_is_ostree' from source: set_fact
  8990 1726773039.93031: variable 'omit' from source: magic vars
  8990 1726773039.93062: variable 'omit' from source: magic vars
  8990 1726773039.93090: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  8990 1726773039.93115: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  8990 1726773039.93132: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  8990 1726773039.93150: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8990 1726773039.93161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  8990 1726773039.93190: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  8990 1726773039.93200: variable 'ansible_host' from source: host vars for 'managed_node3'
  8990 1726773039.93205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8990 1726773039.93299: Set connection var ansible_pipelining to False
  8990 1726773039.93310: Set connection var ansible_timeout to 10
  8990 1726773039.93317: Set connection var ansible_module_compression to ZIP_DEFLATED
  8990 1726773039.93323: Set connection var ansible_shell_executable to /bin/sh
  8990 1726773039.93326: Set connection var ansible_connection to ssh
  8990 1726773039.93334: Set connection var ansible_shell_type to sh
  8990 1726773039.93355: variable 'ansible_shell_executable' from source: unknown
  8990 1726773039.93360: variable 'ansible_connection' from source: unknown
  8990 1726773039.93363: variable 'ansible_module_compression' from source: unknown
  8990 1726773039.93366: variable 'ansible_shell_type' from source: unknown
  8990 1726773039.93368: variable 'ansible_shell_executable' from source: unknown
  8990 1726773039.93372: variable 'ansible_host' from source: host vars for 'managed_node3'
  8990 1726773039.93375: variable 'ansible_pipelining' from source: unknown
  8990 1726773039.93378: variable 'ansible_timeout' from source: unknown
  8990 1726773039.93382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  8990 1726773039.93540: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  8990 1726773039.93553: variable 'omit' from source: magic vars
  8990 1726773039.93559: starting attempt loop
  8990 1726773039.93562: running the handler
  8990 1726773039.93654: variable 'ansible_facts' from source: unknown
  8990 1726773039.93691: _low_level_execute_command(): starting
  8990 1726773039.93700: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  8990 1726773039.96479: stdout chunk (state=2):
>>>/root
<<<
  8990 1726773039.96493: stderr chunk (state=2):
>>><<<
  8990 1726773039.96507: stdout chunk (state=3):
>>><<<
  8990 1726773039.96522: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  8990 1726773039.96537: _low_level_execute_command(): starting
  8990 1726773039.96546: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773039.9653125-8990-224480132918353 `" && echo ansible-tmp-1726773039.9653125-8990-224480132918353="` echo /root/.ansible/tmp/ansible-tmp-1726773039.9653125-8990-224480132918353 `" ) && sleep 0'
  8990 1726773040.00408: stdout chunk (state=2):
>>>ansible-tmp-1726773039.9653125-8990-224480132918353=/root/.ansible/tmp/ansible-tmp-1726773039.9653125-8990-224480132918353
<<<
  8990 1726773040.00471: stderr chunk (state=3):
>>><<<
  8990 1726773040.00479: stdout chunk (state=3):
>>><<<
  8990 1726773040.00503: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773039.9653125-8990-224480132918353=/root/.ansible/tmp/ansible-tmp-1726773039.9653125-8990-224480132918353
, stderr=
  8990 1726773040.00532: variable 'ansible_module_compression' from source: unknown
  8990 1726773040.00578: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
  8990 1726773040.00639: variable 'ansible_facts' from source: unknown
  8990 1726773040.00877: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773039.9653125-8990-224480132918353/AnsiballZ_setup.py
  8990 1726773040.01359: Sending initial data
  8990 1726773040.01366: Sent initial data (152 bytes)
  8990 1726773040.04015: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpicgkfbqk /root/.ansible/tmp/ansible-tmp-1726773039.9653125-8990-224480132918353/AnsiballZ_setup.py
<<<
  8990 1726773040.07316: stderr chunk (state=3):
>>><<<
  8990 1726773040.07328: stdout chunk (state=3):
>>><<<
  8990 1726773040.07353: done transferring module to remote
  8990 1726773040.07366: _low_level_execute_command(): starting
  8990 1726773040.07372: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773039.9653125-8990-224480132918353/ /root/.ansible/tmp/ansible-tmp-1726773039.9653125-8990-224480132918353/AnsiballZ_setup.py && sleep 0'
  8990 1726773040.10176: stderr chunk (state=2):
>>><<<
  8990 1726773040.10193: stdout chunk (state=2):
>>><<<
  8990 1726773040.10211: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8990 1726773040.10215: _low_level_execute_command(): starting
  8990 1726773040.10218: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773039.9653125-8990-224480132918353/AnsiballZ_setup.py && sleep 0'
  8990 1726773040.39199: stdout chunk (state=2):
>>>
{"ansible_facts": {"ansible_pkg_mgr": "dnf"}, "invocation": {"module_args": {"filter": ["ansible_pkg_mgr"], "gather_subset": ["!all"], "gather_timeout": 10, "fact_path": "/etc/ansible/facts.d"}}}
<<<
  8990 1726773040.41027: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8990 1726773040.41038: stdout chunk (state=3):
>>><<<
  8990 1726773040.41048: stderr chunk (state=3):
>>><<<
  8990 1726773040.41060: _low_level_execute_command() done: rc=0, stdout=
{"ansible_facts": {"ansible_pkg_mgr": "dnf"}, "invocation": {"module_args": {"filter": ["ansible_pkg_mgr"], "gather_subset": ["!all"], "gather_timeout": 10, "fact_path": "/etc/ansible/facts.d"}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8990 1726773040.41095: done with _execute_module (ansible.legacy.setup, {'filter': 'ansible_pkg_mgr', 'gather_subset': '!all', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773039.9653125-8990-224480132918353/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  8990 1726773040.41116: Facts {'ansible_facts': {'ansible_pkg_mgr': 'dnf'}, 'invocation': {'module_args': {'filter': ['ansible_pkg_mgr'], 'gather_subset': ['!all'], 'gather_timeout': 10, 'fact_path': '/etc/ansible/facts.d'}}, '_ansible_parsed': True}
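
Before picking a backend, the generic package action gathers a single fact on the target: it stages an AnsiballZ-wrapped ansible.legacy.setup in the ansible-tmp directory created above, runs it with /usr/libexec/platform-python, and reads back ansible_pkg_mgr ("dnf" here), which decides which concrete module to transfer next. Written out as an explicit task (normally the action plugin issues this internally), the same call would look roughly like:

    # Mirrors the module_args recorded above for ansible.legacy.setup.
    - name: Determine the package manager on the target
      ansible.builtin.setup:
        filter:
          - ansible_pkg_mgr
        gather_subset:
          - '!all'
        gather_timeout: 10
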
  8990 1726773040.41189: variable 'ansible_module_compression' from source: unknown
  8990 1726773040.41238: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED
  8990 1726773040.41276: variable 'ansible_facts' from source: unknown
  8990 1726773040.41411: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773039.9653125-8990-224480132918353/AnsiballZ_dnf.py
  8990 1726773040.43090: Sending initial data
  8990 1726773040.43101: Sent initial data (150 bytes)
  8990 1726773040.47872: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpaq46mlog /root/.ansible/tmp/ansible-tmp-1726773039.9653125-8990-224480132918353/AnsiballZ_dnf.py
<<<
  8990 1726773040.50776: stderr chunk (state=3):
>>><<<
  8990 1726773040.50790: stdout chunk (state=3):
>>><<<
  8990 1726773040.50816: done transferring module to remote
  8990 1726773040.50828: _low_level_execute_command(): starting
  8990 1726773040.50834: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773039.9653125-8990-224480132918353/ /root/.ansible/tmp/ansible-tmp-1726773039.9653125-8990-224480132918353/AnsiballZ_dnf.py && sleep 0'
  8990 1726773040.54178: stderr chunk (state=2):
>>><<<
  8990 1726773040.54191: stdout chunk (state=2):
>>><<<
  8990 1726773040.54211: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8990 1726773040.54216: _low_level_execute_command(): starting
  8990 1726773040.54220: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773039.9653125-8990-224480132918353/AnsiballZ_dnf.py && sleep 0'
  8990 1726773043.09251: stdout chunk (state=2):
>>>
{"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}}
<<<
  8990 1726773043.18165: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  8990 1726773043.18179: stdout chunk (state=3):
>>><<<
  8990 1726773043.18192: stderr chunk (state=3):
>>><<<
  8990 1726773043.18207: _low_level_execute_command() done: rc=0, stdout=
{"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}}
, stderr=Shared connection to 10.31.47.99 closed.
  8990 1726773043.18246: done with _execute_module (ansible.legacy.dnf, {'name': ['tuned', 'python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773039.9653125-8990-224480132918353/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  8990 1726773043.18254: _low_level_execute_command(): starting
  8990 1726773043.18260: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773039.9653125-8990-224480132918353/ > /dev/null 2>&1 && sleep 0'
  8990 1726773043.22279: stderr chunk (state=2):
>>><<<
  8990 1726773043.22293: stdout chunk (state=2):
>>><<<
  8990 1726773043.22310: _low_level_execute_command() done: rc=0, stdout=, stderr=
  8990 1726773043.22318: handler run complete
  8990 1726773043.22358: attempt loop complete, returning result
  8990 1726773043.22363: _execute() done
  8990 1726773043.22367: dumping result to json
  8990 1726773043.22373: done dumping result, returning
  8990 1726773043.22382: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed [0affffe7-6841-6cfb-81ae-0000000001f7]
  8990 1726773043.22390: sending task result for task 0affffe7-6841-6cfb-81ae-0000000001f7
  8990 1726773043.22428: done sending task result for task 0affffe7-6841-6cfb-81ae-0000000001f7
  8990 1726773043.22432: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do
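
"Ensure required packages are installed" therefore resolves to ansible.legacy.dnf with the package list taken from __kernel_settings_packages (tuned and python3-configobj); rc=0 with "Nothing to do" means both were already present, so the task reports ok rather than changed. A sketch of how the role presumably declares it; the ostree backend handling is inferred from the __kernel_settings_is_ostree reference earlier in this task's trace and is an assumption, not quoted from the role:

    - name: Ensure required packages are installed
      ansible.builtin.package:
        name: "{{ __kernel_settings_packages }}"
        state: present
        # Assumed: switch backends on rpm-ostree systems; only the variable
        # reference, not this exact expression, is visible in the log.
        use: "{{ (__kernel_settings_is_ostree | d(false)) | ternary('ansible.posix.rhel_rpm_ostree', omit) }}"
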
  8303 1726773043.23882: no more pending results, returning what we have
  8303 1726773043.23888: results queue empty
  8303 1726773043.23889: checking for any_errors_fatal
  8303 1726773043.23894: done checking for any_errors_fatal
  8303 1726773043.23894: checking for max_fail_percentage
  8303 1726773043.23897: done checking for max_fail_percentage
  8303 1726773043.23898: checking to see if all hosts have failed and the running result is not ok
  8303 1726773043.23898: done checking to see if all hosts have failed
  8303 1726773043.23899: getting the remaining hosts for this loop
  8303 1726773043.23900: done getting the remaining hosts for this loop
  8303 1726773043.23903: getting the next task for host managed_node3
  8303 1726773043.23911: done getting next task for host managed_node3
  8303 1726773043.23914:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes
  8303 1726773043.23917:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773043.23928: getting variables
  8303 1726773043.23929: in VariableManager get_vars()
  8303 1726773043.23957: Calling all_inventory to load vars for managed_node3
  8303 1726773043.23960: Calling groups_inventory to load vars for managed_node3
  8303 1726773043.23962: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773043.23973: Calling all_plugins_play to load vars for managed_node3
  8303 1726773043.23976: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773043.23979: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773043.24030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773043.24077: done with get_vars()
  8303 1726773043.24084: done getting variables
  8303 1726773043.24135: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:24
Thursday 19 September 2024  15:10:43 -0400 (0:00:03.354)       0:00:19.820 **** 
  8303 1726773043.24163: entering _queue_task() for managed_node3/debug
  8303 1726773043.24370: worker is 1 (out of 1 available)
  8303 1726773043.24383: exiting _queue_task() for managed_node3/debug
  8303 1726773043.24397: done queuing things up, now waiting for results queue to drain
  8303 1726773043.24398: waiting for pending results...
  9212 1726773043.24791: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes
  9212 1726773043.24920: in run() - task 0affffe7-6841-6cfb-81ae-0000000001f9
  9212 1726773043.24938: variable 'ansible_search_path' from source: unknown
  9212 1726773043.24942: variable 'ansible_search_path' from source: unknown
  9212 1726773043.24972: calling self._execute()
  9212 1726773043.25037: variable 'ansible_host' from source: host vars for 'managed_node3'
  9212 1726773043.25048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9212 1726773043.25057: variable 'omit' from source: magic vars
  9212 1726773043.25501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
  9212 1726773043.27961: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
  9212 1726773043.28057: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
  9212 1726773043.28097: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
  9212 1726773043.28130: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
  9212 1726773043.28155: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
  9212 1726773043.28228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  9212 1726773043.28257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  9212 1726773043.28282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  9212 1726773043.28323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  9212 1726773043.28340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  9212 1726773043.28437: variable '__kernel_settings_is_transactional' from source: set_fact
  9212 1726773043.28455: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False
  9212 1726773043.28460: when evaluation is False, skipping this task
  9212 1726773043.28464: _execute() done
  9212 1726773043.28467: dumping result to json
  9212 1726773043.28470: done dumping result, returning
  9212 1726773043.28477: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes [0affffe7-6841-6cfb-81ae-0000000001f9]
  9212 1726773043.28483: sending task result for task 0affffe7-6841-6cfb-81ae-0000000001f9
  9212 1726773043.28515: done sending task result for task 0affffe7-6841-6cfb-81ae-0000000001f9
  9212 1726773043.28517: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "false_condition": "__kernel_settings_is_transactional | d(false)"
}
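
This skip is the standard conditional pattern: the task is guarded by when: __kernel_settings_is_transactional | d(false), the fact set earlier in the run evaluates to false on this host, so the handler never runs and the callback records the false_condition instead of a result. Roughly (the message text is a placeholder, it is not shown in this log):

    - name: Notify user that reboot is needed to apply changes
      ansible.builtin.debug:
        msg: Reboot is needed to apply the changes  # placeholder wording
      when: __kernel_settings_is_transactional | d(false)
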
  8303 1726773043.28914: no more pending results, returning what we have
  8303 1726773043.28918: results queue empty
  8303 1726773043.28918: checking for any_errors_fatal
  8303 1726773043.28927: done checking for any_errors_fatal
  8303 1726773043.28928: checking for max_fail_percentage
  8303 1726773043.28930: done checking for max_fail_percentage
  8303 1726773043.28930: checking to see if all hosts have failed and the running result is not ok
  8303 1726773043.28931: done checking to see if all hosts have failed
  8303 1726773043.28932: getting the remaining hosts for this loop
  8303 1726773043.28933: done getting the remaining hosts for this loop
  8303 1726773043.28936: getting the next task for host managed_node3
  8303 1726773043.28942: done getting next task for host managed_node3
  8303 1726773043.28946:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems
  8303 1726773043.28950:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773043.28964: getting variables
  8303 1726773043.28965: in VariableManager get_vars()
  8303 1726773043.29001: Calling all_inventory to load vars for managed_node3
  8303 1726773043.29004: Calling groups_inventory to load vars for managed_node3
  8303 1726773043.29007: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773043.29015: Calling all_plugins_play to load vars for managed_node3
  8303 1726773043.29018: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773043.29021: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773043.29073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773043.29122: done with get_vars()
  8303 1726773043.29131: done getting variables
  8303 1726773043.29187: Loading ActionModule 'reboot' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Reboot transactional update systems] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:29
Thursday 19 September 2024  15:10:43 -0400 (0:00:00.050)       0:00:19.870 **** 
  8303 1726773043.29220: entering _queue_task() for managed_node3/reboot
  8303 1726773043.29423: worker is 1 (out of 1 available)
  8303 1726773043.29436: exiting _queue_task() for managed_node3/reboot
  8303 1726773043.29448: done queuing things up, now waiting for results queue to drain
  8303 1726773043.29450: waiting for pending results...
  9214 1726773043.29744: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems
  9214 1726773043.29875: in run() - task 0affffe7-6841-6cfb-81ae-0000000001fa
  9214 1726773043.29894: variable 'ansible_search_path' from source: unknown
  9214 1726773043.29899: variable 'ansible_search_path' from source: unknown
  9214 1726773043.29932: calling self._execute()
  9214 1726773043.30003: variable 'ansible_host' from source: host vars for 'managed_node3'
  9214 1726773043.30014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9214 1726773043.30023: variable 'omit' from source: magic vars
  9214 1726773043.30470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
  9214 1726773043.32991: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
  9214 1726773043.33059: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
  9214 1726773043.33098: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
  9214 1726773043.33132: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
  9214 1726773043.33158: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
  9214 1726773043.33233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  9214 1726773043.33262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  9214 1726773043.33290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  9214 1726773043.33329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  9214 1726773043.33340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  9214 1726773043.33430: variable '__kernel_settings_is_transactional' from source: set_fact
  9214 1726773043.33447: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False
  9214 1726773043.33451: when evaluation is False, skipping this task
  9214 1726773043.33454: _execute() done
  9214 1726773043.33456: dumping result to json
  9214 1726773043.33459: done dumping result, returning
  9214 1726773043.33464: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems [0affffe7-6841-6cfb-81ae-0000000001fa]
  9214 1726773043.33470: sending task result for task 0affffe7-6841-6cfb-81ae-0000000001fa
  9214 1726773043.33497: done sending task result for task 0affffe7-6841-6cfb-81ae-0000000001fa
  9214 1726773043.33500: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__kernel_settings_is_transactional | d(false)",
    "skip_reason": "Conditional result was False"
}
  8303 1726773043.33915: no more pending results, returning what we have
  8303 1726773043.33918: results queue empty
  8303 1726773043.33918: checking for any_errors_fatal
  8303 1726773043.33924: done checking for any_errors_fatal
  8303 1726773043.33925: checking for max_fail_percentage
  8303 1726773043.33926: done checking for max_fail_percentage
  8303 1726773043.33927: checking to see if all hosts have failed and the running result is not ok
  8303 1726773043.33927: done checking to see if all hosts have failed
  8303 1726773043.33928: getting the remaining hosts for this loop
  8303 1726773043.33929: done getting the remaining hosts for this loop
  8303 1726773043.33932: getting the next task for host managed_node3
  8303 1726773043.33938: done getting next task for host managed_node3
  8303 1726773043.33941:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set
  8303 1726773043.33944:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773043.33957: getting variables
  8303 1726773043.33958: in VariableManager get_vars()
  8303 1726773043.33988: Calling all_inventory to load vars for managed_node3
  8303 1726773043.33990: Calling groups_inventory to load vars for managed_node3
  8303 1726773043.33993: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773043.34001: Calling all_plugins_play to load vars for managed_node3
  8303 1726773043.34003: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773043.34006: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773043.34056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773043.34106: done with get_vars()
  8303 1726773043.34115: done getting variables
  8303 1726773043.34167: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:34
Thursday 19 September 2024  15:10:43 -0400 (0:00:00.049)       0:00:19.920 **** 
  8303 1726773043.34201: entering _queue_task() for managed_node3/fail
  8303 1726773043.34403: worker is 1 (out of 1 available)
  8303 1726773043.34418: exiting _queue_task() for managed_node3/fail
  8303 1726773043.34430: done queuing things up, now waiting for results queue to drain
  8303 1726773043.34431: waiting for pending results...
  9215 1726773043.34639: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set
  9215 1726773043.34776: in run() - task 0affffe7-6841-6cfb-81ae-0000000001fb
  9215 1726773043.34795: variable 'ansible_search_path' from source: unknown
  9215 1726773043.34801: variable 'ansible_search_path' from source: unknown
  9215 1726773043.34833: calling self._execute()
  9215 1726773043.34900: variable 'ansible_host' from source: host vars for 'managed_node3'
  9215 1726773043.34909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9215 1726773043.34918: variable 'omit' from source: magic vars
  9215 1726773043.35345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
  9215 1726773043.37800: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
  9215 1726773043.37865: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
  9215 1726773043.37903: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
  9215 1726773043.37938: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
  9215 1726773043.37962: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
  9215 1726773043.38033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  9215 1726773043.38061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  9215 1726773043.38087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  9215 1726773043.38128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  9215 1726773043.38144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  9215 1726773043.38243: variable '__kernel_settings_is_transactional' from source: set_fact
  9215 1726773043.38261: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False
  9215 1726773043.38266: when evaluation is False, skipping this task
  9215 1726773043.38269: _execute() done
  9215 1726773043.38272: dumping result to json
  9215 1726773043.38276: done dumping result, returning
  9215 1726773043.38282: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set [0affffe7-6841-6cfb-81ae-0000000001fb]
  9215 1726773043.38290: sending task result for task 0affffe7-6841-6cfb-81ae-0000000001fb
  9215 1726773043.38321: done sending task result for task 0affffe7-6841-6cfb-81ae-0000000001fb
  9215 1726773043.38324: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__kernel_settings_is_transactional | d(false)",
    "skip_reason": "Conditional result was False"
}
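
The reboot and fail tasks that follow the notification are gated by the same fact, so on this non-transactional host all three are skipped back to back. A hedged sketch of the pair; any extra conditions or opt-in variables (for example a flag allowing automatic reboots) are not visible in this excerpt and are left out:

    - name: Reboot transactional update systems
      ansible.builtin.reboot:
      when: __kernel_settings_is_transactional | d(false)

    - name: Fail if reboot is needed and not set
      ansible.builtin.fail:
        msg: Reboot is required to apply changes but was not performed  # placeholder wording
      when: __kernel_settings_is_transactional | d(false)
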
  8303 1726773043.38715: no more pending results, returning what we have
  8303 1726773043.38718: results queue empty
  8303 1726773043.38719: checking for any_errors_fatal
  8303 1726773043.38724: done checking for any_errors_fatal
  8303 1726773043.38725: checking for max_fail_percentage
  8303 1726773043.38727: done checking for max_fail_percentage
  8303 1726773043.38727: checking to see if all hosts have failed and the running result is not ok
  8303 1726773043.38728: done checking to see if all hosts have failed
  8303 1726773043.38729: getting the remaining hosts for this loop
  8303 1726773043.38730: done getting the remaining hosts for this loop
  8303 1726773043.38733: getting the next task for host managed_node3
  8303 1726773043.38744: done getting next task for host managed_node3
  8303 1726773043.38748:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config
  8303 1726773043.38751:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773043.38766: getting variables
  8303 1726773043.38767: in VariableManager get_vars()
  8303 1726773043.38806: Calling all_inventory to load vars for managed_node3
  8303 1726773043.38809: Calling groups_inventory to load vars for managed_node3
  8303 1726773043.38812: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773043.38821: Calling all_plugins_play to load vars for managed_node3
  8303 1726773043.38823: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773043.38826: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773043.38878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773043.38928: done with get_vars()
  8303 1726773043.38936: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Read tuned main config] ******
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42
Thursday 19 September 2024  15:10:43 -0400 (0:00:00.048)       0:00:19.968 **** 
  8303 1726773043.39022: entering _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config
  8303 1726773043.39230: worker is 1 (out of 1 available)
  8303 1726773043.39244: exiting _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config
  8303 1726773043.39256: done queuing things up, now waiting for results queue to drain
  8303 1726773043.39257: waiting for pending results...
  9217 1726773043.39479: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config
  9217 1726773043.39612: in run() - task 0affffe7-6841-6cfb-81ae-0000000001fd
  9217 1726773043.39630: variable 'ansible_search_path' from source: unknown
  9217 1726773043.39635: variable 'ansible_search_path' from source: unknown
  9217 1726773043.39668: calling self._execute()
  9217 1726773043.39742: variable 'ansible_host' from source: host vars for 'managed_node3'
  9217 1726773043.39752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9217 1726773043.39761: variable 'omit' from source: magic vars
  9217 1726773043.39857: variable 'omit' from source: magic vars
  9217 1726773043.39911: variable 'omit' from source: magic vars
  9217 1726773043.39938: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars
  9217 1726773043.40209: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars
  9217 1726773043.40334: variable '__kernel_settings_tuned_dir' from source: role '' all vars
  9217 1726773043.40371: variable 'omit' from source: magic vars
  9217 1726773043.40412: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  9217 1726773043.40518: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  9217 1726773043.40539: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  9217 1726773043.40556: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9217 1726773043.40568: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9217 1726773043.40598: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  9217 1726773043.40604: variable 'ansible_host' from source: host vars for 'managed_node3'
  9217 1726773043.40608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9217 1726773043.40701: Set connection var ansible_pipelining to False
  9217 1726773043.40713: Set connection var ansible_timeout to 10
  9217 1726773043.40719: Set connection var ansible_module_compression to ZIP_DEFLATED
  9217 1726773043.40724: Set connection var ansible_shell_executable to /bin/sh
  9217 1726773043.40727: Set connection var ansible_connection to ssh
  9217 1726773043.40735: Set connection var ansible_shell_type to sh
  9217 1726773043.40756: variable 'ansible_shell_executable' from source: unknown
  9217 1726773043.40761: variable 'ansible_connection' from source: unknown
  9217 1726773043.40764: variable 'ansible_module_compression' from source: unknown
  9217 1726773043.40767: variable 'ansible_shell_type' from source: unknown
  9217 1726773043.40770: variable 'ansible_shell_executable' from source: unknown
  9217 1726773043.40772: variable 'ansible_host' from source: host vars for 'managed_node3'
  9217 1726773043.40776: variable 'ansible_pipelining' from source: unknown
  9217 1726773043.40779: variable 'ansible_timeout' from source: unknown
  9217 1726773043.40783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9217 1726773043.40950: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
  9217 1726773043.40963: variable 'omit' from source: magic vars
  9217 1726773043.40969: starting attempt loop
  9217 1726773043.40972: running the handler
  9217 1726773043.40984: _low_level_execute_command(): starting
  9217 1726773043.40994: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  9217 1726773043.43666: stdout chunk (state=2):
>>>/root
<<<
  9217 1726773043.43816: stderr chunk (state=3):
>>><<<
  9217 1726773043.43824: stdout chunk (state=3):
>>><<<
  9217 1726773043.43845: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  9217 1726773043.43859: _low_level_execute_command(): starting
  9217 1726773043.43865: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773043.438526-9217-98160856867018 `" && echo ansible-tmp-1726773043.438526-9217-98160856867018="` echo /root/.ansible/tmp/ansible-tmp-1726773043.438526-9217-98160856867018 `" ) && sleep 0'
  9217 1726773043.47122: stdout chunk (state=2):
>>>ansible-tmp-1726773043.438526-9217-98160856867018=/root/.ansible/tmp/ansible-tmp-1726773043.438526-9217-98160856867018
<<<
  9217 1726773043.47660: stderr chunk (state=3):
>>><<<
  9217 1726773043.47674: stdout chunk (state=3):
>>><<<
  9217 1726773043.47697: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773043.438526-9217-98160856867018=/root/.ansible/tmp/ansible-tmp-1726773043.438526-9217-98160856867018
, stderr=
  9217 1726773043.47746: variable 'ansible_module_compression' from source: unknown
  9217 1726773043.47788: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED
  9217 1726773043.47827: variable 'ansible_facts' from source: unknown
  9217 1726773043.47919: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773043.438526-9217-98160856867018/AnsiballZ_kernel_settings_get_config.py
  9217 1726773043.48860: Sending initial data
  9217 1726773043.48873: Sent initial data (171 bytes)
  9217 1726773043.54993: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmposxhy0_c /root/.ansible/tmp/ansible-tmp-1726773043.438526-9217-98160856867018/AnsiballZ_kernel_settings_get_config.py
<<<
  9217 1726773043.56693: stderr chunk (state=3):
>>><<<
  9217 1726773043.56705: stdout chunk (state=3):
>>><<<
  9217 1726773043.56729: done transferring module to remote
  9217 1726773043.56743: _low_level_execute_command(): starting
  9217 1726773043.56749: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773043.438526-9217-98160856867018/ /root/.ansible/tmp/ansible-tmp-1726773043.438526-9217-98160856867018/AnsiballZ_kernel_settings_get_config.py && sleep 0'
  9217 1726773043.60992: stderr chunk (state=2):
>>><<<
  9217 1726773043.61006: stdout chunk (state=2):
>>><<<
  9217 1726773043.61026: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9217 1726773043.61032: _low_level_execute_command(): starting
  9217 1726773043.61038: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773043.438526-9217-98160856867018/AnsiballZ_kernel_settings_get_config.py && sleep 0'
  9217 1726773043.77784: stdout chunk (state=2):
>>>
{"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}}
<<<
  9217 1726773043.79146: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  9217 1726773043.79157: stdout chunk (state=3):
>>><<<
  9217 1726773043.79170: stderr chunk (state=3):
>>><<<
  9217 1726773043.79183: _low_level_execute_command() done: rc=0, stdout=
{"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}}
, stderr=Shared connection to 10.31.47.99 closed.
  9217 1726773043.79219: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/tuned-main.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773043.438526-9217-98160856867018/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  9217 1726773043.79230: _low_level_execute_command(): starting
  9217 1726773043.79236: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773043.438526-9217-98160856867018/ > /dev/null 2>&1 && sleep 0'
  9217 1726773043.82058: stderr chunk (state=2):
>>><<<
  9217 1726773043.82072: stdout chunk (state=2):
>>><<<
  9217 1726773043.82086: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9217 1726773043.82091: handler run complete
  9217 1726773043.82104: attempt loop complete, returning result
  9217 1726773043.82106: _execute() done
  9217 1726773043.82108: dumping result to json
  9217 1726773043.82111: done dumping result, returning
  9217 1726773043.82116: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config [0affffe7-6841-6cfb-81ae-0000000001fd]
  9217 1726773043.82121: sending task result for task 0affffe7-6841-6cfb-81ae-0000000001fd
  9217 1726773043.82148: done sending task result for task 0affffe7-6841-6cfb-81ae-0000000001fd
  9217 1726773043.82151: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "data": {
        "daemon": "1",
        "default_instance_priority": "0",
        "dynamic_tuning": "0",
        "log_file_count": "2",
        "log_file_max_size": "1MB",
        "reapply_sysctl": "1",
        "recommend_command": "1",
        "sleep_interval": "1",
        "udev_buffer_size": "1MB",
        "update_interval": "10"
    }
}
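
The result above is a single call to the fedora.linux_system_roles.kernel_settings_get_config module with path=/etc/tuned/tuned-main.conf, as the module_args in the "done with _execute_module" line show. A minimal sketch of an equivalent task follows; the register name is inferred from the __kernel_settings_register_tuned_main variable that appears later in this log, and only the module name and path argument are taken verbatim from the output above.

    - name: Read tuned main config
      fedora.linux_system_roles.kernel_settings_get_config:
        path: /etc/tuned/tuned-main.conf
      register: __kernel_settings_register_tuned_main  # inferred name; module and path come from the log
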
  8303 1726773043.82348: no more pending results, returning what we have
  8303 1726773043.82352: results queue empty
  8303 1726773043.82352: checking for any_errors_fatal
  8303 1726773043.82357: done checking for any_errors_fatal
  8303 1726773043.82358: checking for max_fail_percentage
  8303 1726773043.82359: done checking for max_fail_percentage
  8303 1726773043.82360: checking to see if all hosts have failed and the running result is not ok
  8303 1726773043.82360: done checking to see if all hosts have failed
  8303 1726773043.82361: getting the remaining hosts for this loop
  8303 1726773043.82362: done getting the remaining hosts for this loop
  8303 1726773043.82365: getting the next task for host managed_node3
  8303 1726773043.82371: done getting next task for host managed_node3
  8303 1726773043.82373:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory
  8303 1726773043.82376:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773043.82388: getting variables
  8303 1726773043.82389: in VariableManager get_vars()
  8303 1726773043.82414: Calling all_inventory to load vars for managed_node3
  8303 1726773043.82416: Calling groups_inventory to load vars for managed_node3
  8303 1726773043.82417: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773043.82424: Calling all_plugins_play to load vars for managed_node3
  8303 1726773043.82425: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773043.82427: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773043.82461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773043.82496: done with get_vars()
  8303 1726773043.82503: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50
Thursday 19 September 2024  15:10:43 -0400 (0:00:00.435)       0:00:20.404 **** 
  8303 1726773043.82564: entering _queue_task() for managed_node3/stat
  8303 1726773043.82722: worker is 1 (out of 1 available)
  8303 1726773043.82736: exiting _queue_task() for managed_node3/stat
  8303 1726773043.82748: done queuing things up, now waiting for results queue to drain
  8303 1726773043.82750: waiting for pending results...
  9253 1726773043.82874: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory
  9253 1726773043.82986: in run() - task 0affffe7-6841-6cfb-81ae-0000000001fe
  9253 1726773043.83003: variable 'ansible_search_path' from source: unknown
  9253 1726773043.83008: variable 'ansible_search_path' from source: unknown
  9253 1726773043.83046: variable '__prof_from_conf' from source: task vars
  9253 1726773043.83272: variable '__prof_from_conf' from source: task vars
  9253 1726773043.83486: variable '__data' from source: task vars
  9253 1726773043.83546: variable '__kernel_settings_register_tuned_main' from source: set_fact
  9253 1726773043.83718: variable '__kernel_settings_tuned_dir' from source: role '' all vars
  9253 1726773043.83729: variable '__kernel_settings_tuned_dir' from source: role '' all vars
  9253 1726773043.83798: variable '__kernel_settings_tuned_dir' from source: role '' all vars
  9253 1726773043.83818: variable 'omit' from source: magic vars
  9253 1726773043.83903: variable 'ansible_host' from source: host vars for 'managed_node3'
  9253 1726773043.83914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9253 1726773043.83923: variable 'omit' from source: magic vars
  9253 1726773043.84127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
  9253 1726773043.86236: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
  9253 1726773043.86299: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
  9253 1726773043.86328: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
  9253 1726773043.86355: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
  9253 1726773043.86378: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
  9253 1726773043.86436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  9253 1726773043.86458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  9253 1726773043.86479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  9253 1726773043.86511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  9253 1726773043.86524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  9253 1726773043.86595: variable 'item' from source: unknown
  9253 1726773043.86611: Evaluated conditional (item | length > 0): False
  9253 1726773043.86617: when evaluation is False, skipping this task
  9253 1726773043.86639: variable 'item' from source: unknown
  9253 1726773043.86688: variable 'item' from source: unknown
skipping: [managed_node3] => (item=)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "item | length > 0",
    "item": "",
    "skip_reason": "Conditional result was False"
}
  9253 1726773043.86762: variable 'ansible_host' from source: host vars for 'managed_node3'
  9253 1726773043.86772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9253 1726773043.86781: variable 'omit' from source: magic vars
  9253 1726773043.86907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  9253 1726773043.86925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  9253 1726773043.86943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  9253 1726773043.86969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  9253 1726773043.86981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  9253 1726773043.87033: variable 'item' from source: unknown
  9253 1726773043.87040: Evaluated conditional (item | length > 0): True
  9253 1726773043.87045: variable 'omit' from source: magic vars
  9253 1726773043.87077: variable 'omit' from source: magic vars
  9253 1726773043.87109: variable 'item' from source: unknown
  9253 1726773043.87153: variable 'item' from source: unknown
  9253 1726773043.87164: variable 'omit' from source: magic vars
  9253 1726773043.87189: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  9253 1726773043.87209: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  9253 1726773043.87222: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  9253 1726773043.87233: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9253 1726773043.87240: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9253 1726773043.87259: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  9253 1726773043.87262: variable 'ansible_host' from source: host vars for 'managed_node3'
  9253 1726773043.87264: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9253 1726773043.87351: Set connection var ansible_pipelining to False
  9253 1726773043.87371: Set connection var ansible_timeout to 10
  9253 1726773043.87380: Set connection var ansible_module_compression to ZIP_DEFLATED
  9253 1726773043.87388: Set connection var ansible_shell_executable to /bin/sh
  9253 1726773043.87391: Set connection var ansible_connection to ssh
  9253 1726773043.87398: Set connection var ansible_shell_type to sh
  9253 1726773043.87412: variable 'ansible_shell_executable' from source: unknown
  9253 1726773043.87415: variable 'ansible_connection' from source: unknown
  9253 1726773043.87418: variable 'ansible_module_compression' from source: unknown
  9253 1726773043.87422: variable 'ansible_shell_type' from source: unknown
  9253 1726773043.87425: variable 'ansible_shell_executable' from source: unknown
  9253 1726773043.87428: variable 'ansible_host' from source: host vars for 'managed_node3'
  9253 1726773043.87432: variable 'ansible_pipelining' from source: unknown
  9253 1726773043.87435: variable 'ansible_timeout' from source: unknown
  9253 1726773043.87439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9253 1726773043.87538: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
  9253 1726773043.87547: variable 'omit' from source: magic vars
  9253 1726773043.87552: starting attempt loop
  9253 1726773043.87554: running the handler
  9253 1726773043.87561: _low_level_execute_command(): starting
  9253 1726773043.87566: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  9253 1726773043.90114: stdout chunk (state=2):
>>>/root
<<<
  9253 1726773043.90233: stderr chunk (state=3):
>>><<<
  9253 1726773043.90241: stdout chunk (state=3):
>>><<<
  9253 1726773043.90259: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  9253 1726773043.90274: _low_level_execute_command(): starting
  9253 1726773043.90280: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773043.9026937-9253-191468491872179 `" && echo ansible-tmp-1726773043.9026937-9253-191468491872179="` echo /root/.ansible/tmp/ansible-tmp-1726773043.9026937-9253-191468491872179 `" ) && sleep 0'
  9253 1726773043.92944: stdout chunk (state=2):
>>>ansible-tmp-1726773043.9026937-9253-191468491872179=/root/.ansible/tmp/ansible-tmp-1726773043.9026937-9253-191468491872179
<<<
  9253 1726773043.93080: stderr chunk (state=3):
>>><<<
  9253 1726773043.93089: stdout chunk (state=3):
>>><<<
  9253 1726773043.93104: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773043.9026937-9253-191468491872179=/root/.ansible/tmp/ansible-tmp-1726773043.9026937-9253-191468491872179
, stderr=
  9253 1726773043.93141: variable 'ansible_module_compression' from source: unknown
  9253 1726773043.93181: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
  9253 1726773043.93211: variable 'ansible_facts' from source: unknown
  9253 1726773043.93281: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773043.9026937-9253-191468491872179/AnsiballZ_stat.py
  9253 1726773043.93384: Sending initial data
  9253 1726773043.93392: Sent initial data (151 bytes)
  9253 1726773043.96035: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmp361sbjqf /root/.ansible/tmp/ansible-tmp-1726773043.9026937-9253-191468491872179/AnsiballZ_stat.py
<<<
  9253 1726773043.97218: stderr chunk (state=3):
>>><<<
  9253 1726773043.97226: stdout chunk (state=3):
>>><<<
  9253 1726773043.97245: done transferring module to remote
  9253 1726773043.97255: _low_level_execute_command(): starting
  9253 1726773043.97261: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773043.9026937-9253-191468491872179/ /root/.ansible/tmp/ansible-tmp-1726773043.9026937-9253-191468491872179/AnsiballZ_stat.py && sleep 0'
  9253 1726773043.99768: stderr chunk (state=2):
>>><<<
  9253 1726773043.99777: stdout chunk (state=2):
>>><<<
  9253 1726773043.99794: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9253 1726773043.99799: _low_level_execute_command(): starting
  9253 1726773043.99804: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773043.9026937-9253-191468491872179/AnsiballZ_stat.py && sleep 0'
  9253 1726773044.15597: stdout chunk (state=2):
>>>
{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}}
<<<
  9253 1726773044.16689: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  9253 1726773044.16703: stdout chunk (state=3):
>>><<<
  9253 1726773044.16717: stderr chunk (state=3):
>>><<<
  9253 1726773044.16728: _low_level_execute_command() done: rc=0, stdout=
{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}}
, stderr=Shared connection to 10.31.47.99 closed.
  9253 1726773044.16747: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773043.9026937-9253-191468491872179/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  9253 1726773044.16763: _low_level_execute_command(): starting
  9253 1726773044.16769: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773043.9026937-9253-191468491872179/ > /dev/null 2>&1 && sleep 0'
  9253 1726773044.19313: stderr chunk (state=2):
>>><<<
  9253 1726773044.19323: stdout chunk (state=2):
>>><<<
  9253 1726773044.19337: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9253 1726773044.19343: handler run complete
  9253 1726773044.19359: attempt loop complete, returning result
  9253 1726773044.19375: variable 'item' from source: unknown
  9253 1726773044.19457: variable 'item' from source: unknown
ok: [managed_node3] => (item=/etc/tuned/profiles) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "/etc/tuned/profiles",
    "stat": {
        "exists": false
    }
}
  9253 1726773044.19549: variable 'ansible_host' from source: host vars for 'managed_node3'
  9253 1726773044.19559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9253 1726773044.19568: variable 'omit' from source: magic vars
  9253 1726773044.19721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  9253 1726773044.19750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  9253 1726773044.19777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  9253 1726773044.19819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  9253 1726773044.19836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  9253 1726773044.19917: variable 'item' from source: unknown
  9253 1726773044.19927: Evaluated conditional (item | length > 0): True
  9253 1726773044.19932: variable 'omit' from source: magic vars
  9253 1726773044.19948: variable 'omit' from source: magic vars
  9253 1726773044.19997: variable 'item' from source: unknown
  9253 1726773044.20060: variable 'item' from source: unknown
  9253 1726773044.20079: variable 'omit' from source: magic vars
  9253 1726773044.20100: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  9253 1726773044.20109: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9253 1726773044.20116: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9253 1726773044.20129: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  9253 1726773044.20133: variable 'ansible_host' from source: host vars for 'managed_node3'
  9253 1726773044.20136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9253 1726773044.20212: Set connection var ansible_pipelining to False
  9253 1726773044.20222: Set connection var ansible_timeout to 10
  9253 1726773044.20228: Set connection var ansible_module_compression to ZIP_DEFLATED
  9253 1726773044.20237: Set connection var ansible_shell_executable to /bin/sh
  9253 1726773044.20240: Set connection var ansible_connection to ssh
  9253 1726773044.20247: Set connection var ansible_shell_type to sh
  9253 1726773044.20260: variable 'ansible_shell_executable' from source: unknown
  9253 1726773044.20264: variable 'ansible_connection' from source: unknown
  9253 1726773044.20275: variable 'ansible_module_compression' from source: unknown
  9253 1726773044.20279: variable 'ansible_shell_type' from source: unknown
  9253 1726773044.20281: variable 'ansible_shell_executable' from source: unknown
  9253 1726773044.20284: variable 'ansible_host' from source: host vars for 'managed_node3'
  9253 1726773044.20291: variable 'ansible_pipelining' from source: unknown
  9253 1726773044.20295: variable 'ansible_timeout' from source: unknown
  9253 1726773044.20299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9253 1726773044.20363: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  9253 1726773044.20376: variable 'omit' from source: magic vars
  9253 1726773044.20381: starting attempt loop
  9253 1726773044.20399: running the handler
  9253 1726773044.20411: _low_level_execute_command(): starting
  9253 1726773044.20419: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  9253 1726773044.22771: stdout chunk (state=2):
>>>/root
<<<
  9253 1726773044.23366: stderr chunk (state=3):
>>><<<
  9253 1726773044.23373: stdout chunk (state=3):
>>><<<
  9253 1726773044.23390: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  9253 1726773044.23402: _low_level_execute_command(): starting
  9253 1726773044.23407: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773044.2339811-9253-29916841274106 `" && echo ansible-tmp-1726773044.2339811-9253-29916841274106="` echo /root/.ansible/tmp/ansible-tmp-1726773044.2339811-9253-29916841274106 `" ) && sleep 0'
  9253 1726773044.26313: stdout chunk (state=2):
>>>ansible-tmp-1726773044.2339811-9253-29916841274106=/root/.ansible/tmp/ansible-tmp-1726773044.2339811-9253-29916841274106
<<<
  9253 1726773044.26395: stderr chunk (state=3):
>>><<<
  9253 1726773044.26402: stdout chunk (state=3):
>>><<<
  9253 1726773044.26416: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773044.2339811-9253-29916841274106=/root/.ansible/tmp/ansible-tmp-1726773044.2339811-9253-29916841274106
, stderr=
  9253 1726773044.26445: variable 'ansible_module_compression' from source: unknown
  9253 1726773044.26487: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
  9253 1726773044.26508: variable 'ansible_facts' from source: unknown
  9253 1726773044.26570: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773044.2339811-9253-29916841274106/AnsiballZ_stat.py
  9253 1726773044.26904: Sending initial data
  9253 1726773044.26910: Sent initial data (150 bytes)
  9253 1726773044.29686: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmp47hrwvqr /root/.ansible/tmp/ansible-tmp-1726773044.2339811-9253-29916841274106/AnsiballZ_stat.py
<<<
  9253 1726773044.30896: stderr chunk (state=3):
>>><<<
  9253 1726773044.30905: stdout chunk (state=3):
>>><<<
  9253 1726773044.30924: done transferring module to remote
  9253 1726773044.30933: _low_level_execute_command(): starting
  9253 1726773044.30938: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773044.2339811-9253-29916841274106/ /root/.ansible/tmp/ansible-tmp-1726773044.2339811-9253-29916841274106/AnsiballZ_stat.py && sleep 0'
  9253 1726773044.33373: stderr chunk (state=2):
>>><<<
  9253 1726773044.33382: stdout chunk (state=2):
>>><<<
  9253 1726773044.33398: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9253 1726773044.33402: _low_level_execute_command(): starting
  9253 1726773044.33407: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773044.2339811-9253-29916841274106/AnsiballZ_stat.py && sleep 0'
  9253 1726773044.49511: stdout chunk (state=2):
>>>
{"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773038.2860043, "mtime": 1726773036.0089955, "ctime": 1726773036.0089955, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}}
<<<
  9253 1726773044.50592: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  9253 1726773044.50604: stdout chunk (state=3):
>>><<<
  9253 1726773044.50615: stderr chunk (state=3):
>>><<<
  9253 1726773044.50630: _low_level_execute_command() done: rc=0, stdout=
{"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773038.2860043, "mtime": 1726773036.0089955, "ctime": 1726773036.0089955, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}}
, stderr=Shared connection to 10.31.47.99 closed.
  9253 1726773044.50681: done with _execute_module (stat, {'path': '/etc/tuned', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773044.2339811-9253-29916841274106/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  9253 1726773044.50693: _low_level_execute_command(): starting
  9253 1726773044.50699: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773044.2339811-9253-29916841274106/ > /dev/null 2>&1 && sleep 0'
  9253 1726773044.55900: stderr chunk (state=2):
>>><<<
  9253 1726773044.55910: stdout chunk (state=2):
>>><<<
  9253 1726773044.55925: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9253 1726773044.55932: handler run complete
  9253 1726773044.55963: attempt loop complete, returning result
  9253 1726773044.55980: variable 'item' from source: unknown
  9253 1726773044.56042: variable 'item' from source: unknown
ok: [managed_node3] => (item=/etc/tuned) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "/etc/tuned",
    "stat": {
        "atime": 1726773038.2860043,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1726773036.0089955,
        "dev": 51713,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 917919,
        "isblk": false,
        "ischr": false,
        "isdir": true,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/directory",
        "mode": "0755",
        "mtime": 1726773036.0089955,
        "nlink": 4,
        "path": "/etc/tuned",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 159,
        "uid": 0,
        "version": "1785990601",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
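
Taken together, the loop results above (an empty item skipped by the "item | length > 0" conditional, /etc/tuned/profiles reported as not existing, and /etc/tuned reported as an existing directory) are consistent with a stat task of roughly the following shape. The exact loop expression used by the role is not visible in this log, so the item list below is an assumption reconstructed from the observed items; the register name matches the __kernel_settings_find_profile_dirs variable referenced by the next task.

    - name: Find tuned profile parent directory
      stat:
        path: "{{ item }}"
      loop:                              # assumed list, built from the items seen in the results above
        - "{{ __prof_from_conf }}"       # evaluated to "" in this run, hence the skipped item
        - /etc/tuned/profiles
        - /etc/tuned
      when: item | length > 0
      register: __kernel_settings_find_profile_dirs
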
  9253 1726773044.56088: dumping result to json
  9253 1726773044.56098: done dumping result, returning
  9253 1726773044.56107: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory [0affffe7-6841-6cfb-81ae-0000000001fe]
  9253 1726773044.56113: sending task result for task 0affffe7-6841-6cfb-81ae-0000000001fe
  9253 1726773044.56150: done sending task result for task 0affffe7-6841-6cfb-81ae-0000000001fe
  9253 1726773044.56154: WORKER PROCESS EXITING
  8303 1726773044.56327: no more pending results, returning what we have
  8303 1726773044.56330: results queue empty
  8303 1726773044.56331: checking for any_errors_fatal
  8303 1726773044.56335: done checking for any_errors_fatal
  8303 1726773044.56336: checking for max_fail_percentage
  8303 1726773044.56338: done checking for max_fail_percentage
  8303 1726773044.56338: checking to see if all hosts have failed and the running result is not ok
  8303 1726773044.56339: done checking to see if all hosts have failed
  8303 1726773044.56339: getting the remaining hosts for this loop
  8303 1726773044.56340: done getting the remaining hosts for this loop
  8303 1726773044.56343: getting the next task for host managed_node3
  8303 1726773044.56349: done getting next task for host managed_node3
  8303 1726773044.56351:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir
  8303 1726773044.56355:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773044.56364: getting variables
  8303 1726773044.56365: in VariableManager get_vars()
  8303 1726773044.56400: Calling all_inventory to load vars for managed_node3
  8303 1726773044.56403: Calling groups_inventory to load vars for managed_node3
  8303 1726773044.56405: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773044.56413: Calling all_plugins_play to load vars for managed_node3
  8303 1726773044.56415: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773044.56418: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773044.56481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773044.56522: done with get_vars()
  8303 1726773044.56531: done getting variables
  8303 1726773044.56592: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:63
Thursday 19 September 2024  15:10:44 -0400 (0:00:00.740)       0:00:21.144 **** 
  8303 1726773044.56623: entering _queue_task() for managed_node3/set_fact
  8303 1726773044.56806: worker is 1 (out of 1 available)
  8303 1726773044.56820: exiting _queue_task() for managed_node3/set_fact
  8303 1726773044.56833: done queuing things up, now waiting for results queue to drain
  8303 1726773044.56834: waiting for pending results...
  9308 1726773044.56979: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir
  9308 1726773044.57114: in run() - task 0affffe7-6841-6cfb-81ae-0000000001ff
  9308 1726773044.57133: variable 'ansible_search_path' from source: unknown
  9308 1726773044.57138: variable 'ansible_search_path' from source: unknown
  9308 1726773044.57168: calling self._execute()
  9308 1726773044.57234: variable 'ansible_host' from source: host vars for 'managed_node3'
  9308 1726773044.57243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9308 1726773044.57252: variable 'omit' from source: magic vars
  9308 1726773044.57348: variable 'omit' from source: magic vars
  9308 1726773044.57404: variable 'omit' from source: magic vars
  9308 1726773044.57898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
  9308 1726773044.60047: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
  9308 1726773044.60115: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
  9308 1726773044.60153: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
  9308 1726773044.60189: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
  9308 1726773044.60216: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
  9308 1726773044.60291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  9308 1726773044.60321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  9308 1726773044.60348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  9308 1726773044.60390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  9308 1726773044.60405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  9308 1726773044.60452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  9308 1726773044.60477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  9308 1726773044.60504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  9308 1726773044.60544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  9308 1726773044.60559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  9308 1726773044.60632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  9308 1726773044.60655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  9308 1726773044.60679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  9308 1726773044.60719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  9308 1726773044.60732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  9308 1726773044.60949: variable '__kernel_settings_find_profile_dirs' from source: set_fact
  9308 1726773044.61047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
  9308 1726773044.61248: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
  9308 1726773044.61284: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
  9308 1726773044.61321: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
  9308 1726773044.61348: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
  9308 1726773044.61387: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
  9308 1726773044.61408: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
  9308 1726773044.61431: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
  9308 1726773044.61456: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
  9308 1726773044.61506: variable 'omit' from source: magic vars
  9308 1726773044.61533: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  9308 1726773044.61559: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  9308 1726773044.61576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  9308 1726773044.61594: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9308 1726773044.61605: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9308 1726773044.61631: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  9308 1726773044.61637: variable 'ansible_host' from source: host vars for 'managed_node3'
  9308 1726773044.61641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9308 1726773044.61735: Set connection var ansible_pipelining to False
  9308 1726773044.61747: Set connection var ansible_timeout to 10
  9308 1726773044.61753: Set connection var ansible_module_compression to ZIP_DEFLATED
  9308 1726773044.61759: Set connection var ansible_shell_executable to /bin/sh
  9308 1726773044.61762: Set connection var ansible_connection to ssh
  9308 1726773044.61768: Set connection var ansible_shell_type to sh
  9308 1726773044.61792: variable 'ansible_shell_executable' from source: unknown
  9308 1726773044.61798: variable 'ansible_connection' from source: unknown
  9308 1726773044.61801: variable 'ansible_module_compression' from source: unknown
  9308 1726773044.61804: variable 'ansible_shell_type' from source: unknown
  9308 1726773044.61807: variable 'ansible_shell_executable' from source: unknown
  9308 1726773044.61810: variable 'ansible_host' from source: host vars for 'managed_node3'
  9308 1726773044.61813: variable 'ansible_pipelining' from source: unknown
  9308 1726773044.61816: variable 'ansible_timeout' from source: unknown
  9308 1726773044.61820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9308 1726773044.61908: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  9308 1726773044.61920: variable 'omit' from source: magic vars
  9308 1726773044.61926: starting attempt loop
  9308 1726773044.61929: running the handler
  9308 1726773044.61939: handler run complete
  9308 1726773044.61947: attempt loop complete, returning result
  9308 1726773044.61950: _execute() done
  9308 1726773044.61953: dumping result to json
  9308 1726773044.61956: done dumping result, returning
  9308 1726773044.61963: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir [0affffe7-6841-6cfb-81ae-0000000001ff]
  9308 1726773044.61969: sending task result for task 0affffe7-6841-6cfb-81ae-0000000001ff
  9308 1726773044.61994: done sending task result for task 0affffe7-6841-6cfb-81ae-0000000001ff
  9308 1726773044.61998: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "__kernel_settings_profile_parent": "/etc/tuned"
    },
    "changed": false
}
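
The fact set here resolves to /etc/tuned, the only candidate whose stat result reported exists: true. The role's actual Jinja expression is not shown in this log; an illustrative way to derive the same value from the registered loop results would be:

    - name: Set tuned profile parent dir
      set_fact:
        __kernel_settings_profile_parent: >-
          {{ __kernel_settings_find_profile_dirs.results
             | selectattr('stat', 'defined')
             | selectattr('stat.exists')
             | map(attribute='item')
             | first }}
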
  8303 1726773044.62525: no more pending results, returning what we have
  8303 1726773044.62528: results queue empty
  8303 1726773044.62530: checking for any_errors_fatal
  8303 1726773044.62537: done checking for any_errors_fatal
  8303 1726773044.62538: checking for max_fail_percentage
  8303 1726773044.62540: done checking for max_fail_percentage
  8303 1726773044.62540: checking to see if all hosts have failed and the running result is not ok
  8303 1726773044.62541: done checking to see if all hosts have failed
  8303 1726773044.62541: getting the remaining hosts for this loop
  8303 1726773044.62542: done getting the remaining hosts for this loop
  8303 1726773044.62545: getting the next task for host managed_node3
  8303 1726773044.62552: done getting next task for host managed_node3
  8303 1726773044.62555:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started
  8303 1726773044.62558:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773044.62568: getting variables
  8303 1726773044.62569: in VariableManager get_vars()
  8303 1726773044.62604: Calling all_inventory to load vars for managed_node3
  8303 1726773044.62607: Calling groups_inventory to load vars for managed_node3
  8303 1726773044.62609: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773044.62619: Calling all_plugins_play to load vars for managed_node3
  8303 1726773044.62621: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773044.62624: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773044.62672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773044.62721: done with get_vars()
  8303 1726773044.62730: done getting variables
  8303 1726773044.62788: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67
Thursday 19 September 2024  15:10:44 -0400 (0:00:00.061)       0:00:21.206 **** 
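
As the debug lines below show, this task loops over __kernel_settings_services, which the role loads via include_vars; in this run only the tuned service is handled (see the systemd module output further down). An approximate shape for the task, with the state and enabled arguments assumed from the task name rather than taken from the log:

    - name: Ensure required services are enabled and started
      service:
        name: "{{ item }}"
        state: started       # assumed from the task name
        enabled: true        # assumed from the task name
      loop: "{{ __kernel_settings_services }}"
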
  8303 1726773044.62819: entering _queue_task() for managed_node3/service
  8303 1726773044.63018: worker is 1 (out of 1 available)
  8303 1726773044.63031: exiting _queue_task() for managed_node3/service
  8303 1726773044.63043: done queuing things up, now waiting for results queue to drain
  8303 1726773044.63044: waiting for pending results...
  9311 1726773044.63278: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started
  9311 1726773044.63415: in run() - task 0affffe7-6841-6cfb-81ae-000000000200
  9311 1726773044.63433: variable 'ansible_search_path' from source: unknown
  9311 1726773044.63437: variable 'ansible_search_path' from source: unknown
  9311 1726773044.63479: variable '__kernel_settings_services' from source: include_vars
  9311 1726773044.63763: variable '__kernel_settings_services' from source: include_vars
  9311 1726773044.63950: variable 'omit' from source: magic vars
  9311 1726773044.64039: variable 'ansible_host' from source: host vars for 'managed_node3'
  9311 1726773044.64051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9311 1726773044.64060: variable 'omit' from source: magic vars
  9311 1726773044.64441: variable 'omit' from source: magic vars
  9311 1726773044.64493: variable 'omit' from source: magic vars
  9311 1726773044.64540: variable 'item' from source: unknown
  9311 1726773044.64615: variable 'item' from source: unknown
  9311 1726773044.64641: variable 'omit' from source: magic vars
  9311 1726773044.64680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  9311 1726773044.64714: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  9311 1726773044.64735: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  9311 1726773044.64753: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9311 1726773044.64766: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9311 1726773044.64795: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  9311 1726773044.64801: variable 'ansible_host' from source: host vars for 'managed_node3'
  9311 1726773044.64805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9311 1726773044.64902: Set connection var ansible_pipelining to False
  9311 1726773044.64914: Set connection var ansible_timeout to 10
  9311 1726773044.64920: Set connection var ansible_module_compression to ZIP_DEFLATED
  9311 1726773044.64927: Set connection var ansible_shell_executable to /bin/sh
  9311 1726773044.64930: Set connection var ansible_connection to ssh
  9311 1726773044.64938: Set connection var ansible_shell_type to sh
  9311 1726773044.64956: variable 'ansible_shell_executable' from source: unknown
  9311 1726773044.64960: variable 'ansible_connection' from source: unknown
  9311 1726773044.64963: variable 'ansible_module_compression' from source: unknown
  9311 1726773044.64966: variable 'ansible_shell_type' from source: unknown
  9311 1726773044.64969: variable 'ansible_shell_executable' from source: unknown
  9311 1726773044.64971: variable 'ansible_host' from source: host vars for 'managed_node3'
  9311 1726773044.64975: variable 'ansible_pipelining' from source: unknown
  9311 1726773044.64978: variable 'ansible_timeout' from source: unknown
  9311 1726773044.64981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9311 1726773044.65106: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  9311 1726773044.65118: variable 'omit' from source: magic vars
  9311 1726773044.65124: starting attempt loop
  9311 1726773044.65128: running the handler
  9311 1726773044.65213: variable 'ansible_facts' from source: unknown
  9311 1726773044.65250: _low_level_execute_command(): starting
  9311 1726773044.65260: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  9311 1726773044.69802: stdout chunk (state=2):
>>>/root
<<<
  9311 1726773044.69936: stderr chunk (state=3):
>>><<<
  9311 1726773044.69947: stdout chunk (state=3):
>>><<<
  9311 1726773044.69975: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  9311 1726773044.69993: _low_level_execute_command(): starting
  9311 1726773044.70001: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773044.699862-9311-2755908257431 `" && echo ansible-tmp-1726773044.699862-9311-2755908257431="` echo /root/.ansible/tmp/ansible-tmp-1726773044.699862-9311-2755908257431 `" ) && sleep 0'
  9311 1726773044.72831: stdout chunk (state=2):
>>>ansible-tmp-1726773044.699862-9311-2755908257431=/root/.ansible/tmp/ansible-tmp-1726773044.699862-9311-2755908257431
<<<
  9311 1726773044.72989: stderr chunk (state=3):
>>><<<
  9311 1726773044.72999: stdout chunk (state=3):
>>><<<
  9311 1726773044.73017: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773044.699862-9311-2755908257431=/root/.ansible/tmp/ansible-tmp-1726773044.699862-9311-2755908257431
, stderr=
  9311 1726773044.73046: variable 'ansible_module_compression' from source: unknown
  9311 1726773044.73099: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
  9311 1726773044.73157: variable 'ansible_facts' from source: unknown
  9311 1726773044.73384: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773044.699862-9311-2755908257431/AnsiballZ_setup.py
  9311 1726773044.74156: Sending initial data
  9311 1726773044.74164: Sent initial data (149 bytes)
  9311 1726773044.76993: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmp24f1abj2 /root/.ansible/tmp/ansible-tmp-1726773044.699862-9311-2755908257431/AnsiballZ_setup.py
<<<
  9311 1726773044.79856: stderr chunk (state=3):
>>><<<
  9311 1726773044.79867: stdout chunk (state=3):
>>><<<
  9311 1726773044.79891: done transferring module to remote
  9311 1726773044.79905: _low_level_execute_command(): starting
  9311 1726773044.79910: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773044.699862-9311-2755908257431/ /root/.ansible/tmp/ansible-tmp-1726773044.699862-9311-2755908257431/AnsiballZ_setup.py && sleep 0'
  9311 1726773044.82574: stderr chunk (state=2):
>>><<<
  9311 1726773044.82589: stdout chunk (state=2):
>>><<<
  9311 1726773044.82605: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9311 1726773044.82609: _low_level_execute_command(): starting
  9311 1726773044.82615: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773044.699862-9311-2755908257431/AnsiballZ_setup.py && sleep 0'
  9311 1726773045.10792: stdout chunk (state=2):
>>>
{"ansible_facts": {"ansible_service_mgr": "systemd"}, "invocation": {"module_args": {"gather_subset": ["!all"], "filter": ["ansible_service_mgr"], "gather_timeout": 10, "fact_path": "/etc/ansible/facts.d"}}}
<<<
  9311 1726773045.12598: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  9311 1726773045.12642: stderr chunk (state=3):
>>><<<
  9311 1726773045.12649: stdout chunk (state=3):
>>><<<
  9311 1726773045.12669: _low_level_execute_command() done: rc=0, stdout=
{"ansible_facts": {"ansible_service_mgr": "systemd"}, "invocation": {"module_args": {"gather_subset": ["!all"], "filter": ["ansible_service_mgr"], "gather_timeout": 10, "fact_path": "/etc/ansible/facts.d"}}}
, stderr=Shared connection to 10.31.47.99 closed.
  9311 1726773045.12703: done with _execute_module (ansible.legacy.setup, {'gather_subset': '!all', 'filter': 'ansible_service_mgr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773044.699862-9311-2755908257431/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  9311 1726773045.12723: Facts {'ansible_facts': {'ansible_service_mgr': 'systemd'}, 'invocation': {'module_args': {'gather_subset': ['!all'], 'filter': ['ansible_service_mgr'], 'gather_timeout': 10, 'fact_path': '/etc/ansible/facts.d'}}, '_ansible_parsed': True}
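
Before transferring the systemd module, the service action plugin gathers a single fact, ansible_service_mgr, to pick the right backend; the module_args above show gather_subset=['!all'] and filter=['ansible_service_mgr']. Expressed as a standalone task (illustrative only; the plugin calls the setup module internally rather than through a task), that fact-gathering step looks roughly like this:

    - name: Detect the service manager (done internally by the service action)
      ansible.legacy.setup:
        gather_subset: "!all"
        filter: ansible_service_mgr
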
  9311 1726773045.12787: variable 'ansible_module_compression' from source: unknown
  9311 1726773045.12830: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED
  9311 1726773045.12882: variable 'ansible_facts' from source: unknown
  9311 1726773045.13108: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773044.699862-9311-2755908257431/AnsiballZ_systemd.py
  9311 1726773045.13553: Sending initial data
  9311 1726773045.13560: Sent initial data (151 bytes)
  9311 1726773045.16204: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmp5lf1tbtd /root/.ansible/tmp/ansible-tmp-1726773044.699862-9311-2755908257431/AnsiballZ_systemd.py
<<<
  9311 1726773045.18607: stderr chunk (state=3):
>>><<<
  9311 1726773045.18617: stdout chunk (state=3):
>>><<<
  9311 1726773045.18641: done transferring module to remote
  9311 1726773045.18651: _low_level_execute_command(): starting
  9311 1726773045.18658: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773044.699862-9311-2755908257431/ /root/.ansible/tmp/ansible-tmp-1726773044.699862-9311-2755908257431/AnsiballZ_systemd.py && sleep 0'
  9311 1726773045.21492: stderr chunk (state=2):
>>><<<
  9311 1726773045.21509: stdout chunk (state=3):
>>><<<
  9311 1726773045.21520: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9311 1726773045.21525: _low_level_execute_command(): starting
  9311 1726773045.21530: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773044.699862-9311-2755908257431/AnsiballZ_systemd.py && sleep 0'
  9311 1726773045.49802: stdout chunk (state=2):
>>>
{"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:38 EDT", "WatchdogTimestampMonotonic": "453344532", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "9802", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ExecMainStartTimestampMonotonic": "453204995", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "9802", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:38 EDT] ; stop_time=[n/a] ; pid=9802 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "17063936", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryH<<<
  9311 1726773045.49822: stdout chunk (state=3):
>>>igh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target", "Before": "shutdown.target 
multi-user.target", "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:38 EDT", "StateChangeTimestampMonotonic": "453344536", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:38 EDT", "InactiveExitTimestampMonotonic": "453205057", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ActiveEnterTimestampMonotonic": "453344536", "ActiveExitTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ActiveExitTimestampMonotonic": "453097312", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:38 EDT", "InactiveEnterTimestampMonotonic": "453201635", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ConditionTimestampMonotonic": "453202685", "AssertTimestamp": "Thu 2024-09-19 15:10:38 EDT", "AssertTimestampMonotonic": "453202686", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "29d42365ee9e42d1916b9ebf15b9284e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}}
<<<
  9311 1726773045.51540: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  9311 1726773045.51593: stderr chunk (state=3):
>>><<<
  9311 1726773045.51601: stdout chunk (state=3):
>>><<<
  9311 1726773045.51620: _low_level_execute_command() done: rc=0, stdout=
{"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:38 EDT", "WatchdogTimestampMonotonic": "453344532", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "9802", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ExecMainStartTimestampMonotonic": "453204995", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "9802", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:38 EDT] ; stop_time=[n/a] ; pid=9802 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "17063936", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", 
"TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:38 EDT", "StateChangeTimestampMonotonic": "453344536", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:38 EDT", "InactiveExitTimestampMonotonic": "453205057", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ActiveEnterTimestampMonotonic": "453344536", "ActiveExitTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ActiveExitTimestampMonotonic": "453097312", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:38 EDT", "InactiveEnterTimestampMonotonic": "453201635", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:38 EDT", 
"ConditionTimestampMonotonic": "453202685", "AssertTimestamp": "Thu 2024-09-19 15:10:38 EDT", "AssertTimestampMonotonic": "453202686", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "29d42365ee9e42d1916b9ebf15b9284e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}}
, stderr=Shared connection to 10.31.47.99 closed.
  9311 1726773045.51865: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773044.699862-9311-2755908257431/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  9311 1726773045.51885: _low_level_execute_command(): starting
  9311 1726773045.51892: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773044.699862-9311-2755908257431/ > /dev/null 2>&1 && sleep 0'
  9311 1726773045.54400: stderr chunk (state=2):
>>><<<
  9311 1726773045.54408: stdout chunk (state=2):
>>><<<
  9311 1726773045.54422: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9311 1726773045.54429: handler run complete
  9311 1726773045.54462: attempt loop complete, returning result
  9311 1726773045.54481: variable 'item' from source: unknown
  9311 1726773045.54543: variable 'item' from source: unknown
ok: [managed_node3] => (item=tuned) => {
    "ansible_loop_var": "item",
    "changed": false,
    "enabled": true,
    "item": "tuned",
    "name": "tuned",
    "state": "started",
    "status": {
        "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:38 EDT",
        "ActiveEnterTimestampMonotonic": "453344536",
        "ActiveExitTimestamp": "Thu 2024-09-19 15:10:38 EDT",
        "ActiveExitTimestampMonotonic": "453097312",
        "ActiveState": "active",
        "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target",
        "AllowIsolate": "no",
        "AllowedCPUs": "",
        "AllowedMemoryNodes": "",
        "AmbientCapabilities": "",
        "AssertResult": "yes",
        "AssertTimestamp": "Thu 2024-09-19 15:10:38 EDT",
        "AssertTimestampMonotonic": "453202686",
        "Before": "shutdown.target multi-user.target",
        "BlockIOAccounting": "no",
        "BlockIOWeight": "[not set]",
        "BusName": "com.redhat.tuned",
        "CPUAccounting": "no",
        "CPUAffinity": "",
        "CPUAffinityFromNUMA": "no",
        "CPUQuotaPerSecUSec": "infinity",
        "CPUQuotaPeriodUSec": "infinity",
        "CPUSchedulingPolicy": "0",
        "CPUSchedulingPriority": "0",
        "CPUSchedulingResetOnFork": "no",
        "CPUShares": "[not set]",
        "CPUUsageNSec": "[not set]",
        "CPUWeight": "[not set]",
        "CacheDirectoryMode": "0755",
        "CanFreeze": "yes",
        "CanIsolate": "no",
        "CanReload": "no",
        "CanStart": "yes",
        "CanStop": "yes",
        "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf",
        "CollectMode": "inactive",
        "ConditionResult": "yes",
        "ConditionTimestamp": "Thu 2024-09-19 15:10:38 EDT",
        "ConditionTimestampMonotonic": "453202685",
        "ConfigurationDirectoryMode": "0755",
        "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target",
        "ControlGroup": "/system.slice/tuned.service",
        "ControlPID": "0",
        "DefaultDependencies": "yes",
        "DefaultMemoryLow": "0",
        "DefaultMemoryMin": "0",
        "Delegate": "no",
        "Description": "Dynamic System Tuning Daemon",
        "DevicePolicy": "auto",
        "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)",
        "DynamicUser": "no",
        "EffectiveCPUs": "",
        "EffectiveMemoryNodes": "",
        "ExecMainCode": "0",
        "ExecMainExitTimestampMonotonic": "0",
        "ExecMainPID": "9802",
        "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:38 EDT",
        "ExecMainStartTimestampMonotonic": "453204995",
        "ExecMainStatus": "0",
        "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:38 EDT] ; stop_time=[n/a] ; pid=9802 ; code=(null) ; status=0/0 }",
        "FailureAction": "none",
        "FileDescriptorStoreMax": "0",
        "FragmentPath": "/usr/lib/systemd/system/tuned.service",
        "FreezerState": "running",
        "GID": "[not set]",
        "GuessMainPID": "yes",
        "IOAccounting": "no",
        "IOSchedulingClass": "0",
        "IOSchedulingPriority": "0",
        "IOWeight": "[not set]",
        "IPAccounting": "no",
        "IPEgressBytes": "18446744073709551615",
        "IPEgressPackets": "18446744073709551615",
        "IPIngressBytes": "18446744073709551615",
        "IPIngressPackets": "18446744073709551615",
        "Id": "tuned.service",
        "IgnoreOnIsolate": "no",
        "IgnoreSIGPIPE": "yes",
        "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:38 EDT",
        "InactiveEnterTimestampMonotonic": "453201635",
        "InactiveExitTimestamp": "Thu 2024-09-19 15:10:38 EDT",
        "InactiveExitTimestampMonotonic": "453205057",
        "InvocationID": "29d42365ee9e42d1916b9ebf15b9284e",
        "JobRunningTimeoutUSec": "infinity",
        "JobTimeoutAction": "none",
        "JobTimeoutUSec": "infinity",
        "KeyringMode": "private",
        "KillMode": "control-group",
        "KillSignal": "15",
        "LimitAS": "infinity",
        "LimitASSoft": "infinity",
        "LimitCORE": "infinity",
        "LimitCORESoft": "0",
        "LimitCPU": "infinity",
        "LimitCPUSoft": "infinity",
        "LimitDATA": "infinity",
        "LimitDATASoft": "infinity",
        "LimitFSIZE": "infinity",
        "LimitFSIZESoft": "infinity",
        "LimitLOCKS": "infinity",
        "LimitLOCKSSoft": "infinity",
        "LimitMEMLOCK": "65536",
        "LimitMEMLOCKSoft": "65536",
        "LimitMSGQUEUE": "819200",
        "LimitMSGQUEUESoft": "819200",
        "LimitNICE": "0",
        "LimitNICESoft": "0",
        "LimitNOFILE": "262144",
        "LimitNOFILESoft": "1024",
        "LimitNPROC": "14003",
        "LimitNPROCSoft": "14003",
        "LimitRSS": "infinity",
        "LimitRSSSoft": "infinity",
        "LimitRTPRIO": "0",
        "LimitRTPRIOSoft": "0",
        "LimitRTTIME": "infinity",
        "LimitRTTIMESoft": "infinity",
        "LimitSIGPENDING": "14003",
        "LimitSIGPENDINGSoft": "14003",
        "LimitSTACK": "infinity",
        "LimitSTACKSoft": "8388608",
        "LoadState": "loaded",
        "LockPersonality": "no",
        "LogLevelMax": "-1",
        "LogRateLimitBurst": "0",
        "LogRateLimitIntervalUSec": "0",
        "LogsDirectoryMode": "0755",
        "MainPID": "9802",
        "MemoryAccounting": "yes",
        "MemoryCurrent": "17063936",
        "MemoryDenyWriteExecute": "no",
        "MemoryHigh": "infinity",
        "MemoryLimit": "infinity",
        "MemoryLow": "0",
        "MemoryMax": "infinity",
        "MemoryMin": "0",
        "MemorySwapMax": "infinity",
        "MountAPIVFS": "no",
        "MountFlags": "",
        "NFileDescriptorStore": "0",
        "NRestarts": "0",
        "NUMAMask": "",
        "NUMAPolicy": "n/a",
        "Names": "tuned.service",
        "NeedDaemonReload": "no",
        "Nice": "0",
        "NoNewPrivileges": "no",
        "NonBlocking": "no",
        "NotifyAccess": "none",
        "OOMScoreAdjust": "0",
        "OnFailureJobMode": "replace",
        "PIDFile": "/run/tuned/tuned.pid",
        "PermissionsStartOnly": "no",
        "Perpetual": "no",
        "PrivateDevices": "no",
        "PrivateMounts": "no",
        "PrivateNetwork": "no",
        "PrivateTmp": "no",
        "PrivateUsers": "no",
        "ProtectControlGroups": "no",
        "ProtectHome": "no",
        "ProtectKernelModules": "no",
        "ProtectKernelTunables": "no",
        "ProtectSystem": "no",
        "RefuseManualStart": "no",
        "RefuseManualStop": "no",
        "RemainAfterExit": "no",
        "RemoveIPC": "no",
        "Requires": "system.slice sysinit.target dbus.service dbus.socket",
        "Restart": "no",
        "RestartUSec": "100ms",
        "RestrictNamespaces": "no",
        "RestrictRealtime": "no",
        "RestrictSUIDSGID": "no",
        "Result": "success",
        "RootDirectoryStartOnly": "no",
        "RuntimeDirectoryMode": "0755",
        "RuntimeDirectoryPreserve": "no",
        "RuntimeMaxUSec": "infinity",
        "SameProcessGroup": "no",
        "SecureBits": "0",
        "SendSIGHUP": "no",
        "SendSIGKILL": "yes",
        "Slice": "system.slice",
        "StandardError": "inherit",
        "StandardInput": "null",
        "StandardInputData": "",
        "StandardOutput": "journal",
        "StartLimitAction": "none",
        "StartLimitBurst": "5",
        "StartLimitIntervalUSec": "10s",
        "StartupBlockIOWeight": "[not set]",
        "StartupCPUShares": "[not set]",
        "StartupCPUWeight": "[not set]",
        "StartupIOWeight": "[not set]",
        "StateChangeTimestamp": "Thu 2024-09-19 15:10:38 EDT",
        "StateChangeTimestampMonotonic": "453344536",
        "StateDirectoryMode": "0755",
        "StatusErrno": "0",
        "StopWhenUnneeded": "no",
        "SubState": "running",
        "SuccessAction": "none",
        "SyslogFacility": "3",
        "SyslogLevel": "6",
        "SyslogLevelPrefix": "yes",
        "SyslogPriority": "30",
        "SystemCallErrorNumber": "0",
        "TTYReset": "no",
        "TTYVHangup": "no",
        "TTYVTDisallocate": "no",
        "TasksAccounting": "yes",
        "TasksCurrent": "4",
        "TasksMax": "22405",
        "TimeoutStartUSec": "1min 30s",
        "TimeoutStopUSec": "1min 30s",
        "TimerSlackNSec": "50000",
        "Transient": "no",
        "Type": "dbus",
        "UID": "[not set]",
        "UMask": "0022",
        "UnitFilePreset": "enabled",
        "UnitFileState": "enabled",
        "UtmpMode": "init",
        "WantedBy": "multi-user.target",
        "WatchdogTimestamp": "Thu 2024-09-19 15:10:38 EDT",
        "WatchdogTimestampMonotonic": "453344532",
        "WatchdogUSec": "0"
    }
}
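The ok result above (item=tuned) was produced by ansible.legacy.systemd with name=tuned, state=started, enabled=true, run inside a loop (ansible_loop_var is item). A minimal task consistent with those module_args is sketched below; the real task lives in the role's tasks/main.yml and may differ, and the __kernel_settings_services list variable is an assumption.

    # Sketch only: a looped service task matching the module_args and loop
    # item shown above. The list variable name is hypothetical.
    - name: Ensure required services are enabled and started
      ansible.builtin.systemd:
        name: "{{ item }}"
        state: started
        enabled: true
      loop: "{{ __kernel_settings_services | default(['tuned']) }}"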
  9311 1726773045.54640: dumping result to json
  9311 1726773045.54661: done dumping result, returning
  9311 1726773045.54669: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started [0affffe7-6841-6cfb-81ae-000000000200]
  9311 1726773045.54676: sending task result for task 0affffe7-6841-6cfb-81ae-000000000200
  9311 1726773045.54795: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000200
  9311 1726773045.54800: WORKER PROCESS EXITING
  8303 1726773045.55346: no more pending results, returning what we have
  8303 1726773045.55349: results queue empty
  8303 1726773045.55350: checking for any_errors_fatal
  8303 1726773045.55353: done checking for any_errors_fatal
  8303 1726773045.55354: checking for max_fail_percentage
  8303 1726773045.55355: done checking for max_fail_percentage
  8303 1726773045.55356: checking to see if all hosts have failed and the running result is not ok
  8303 1726773045.55356: done checking to see if all hosts have failed
  8303 1726773045.55357: getting the remaining hosts for this loop
  8303 1726773045.55358: done getting the remaining hosts for this loop
  8303 1726773045.55361: getting the next task for host managed_node3
  8303 1726773045.55369: done getting next task for host managed_node3
  8303 1726773045.55372:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists
  8303 1726773045.55375:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773045.55384: getting variables
  8303 1726773045.55387: in VariableManager get_vars()
  8303 1726773045.55410: Calling all_inventory to load vars for managed_node3
  8303 1726773045.55413: Calling groups_inventory to load vars for managed_node3
  8303 1726773045.55415: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773045.55423: Calling all_plugins_play to load vars for managed_node3
  8303 1726773045.55426: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773045.55428: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773045.55478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773045.55521: done with get_vars()
  8303 1726773045.55528: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74
Thursday 19 September 2024  15:10:45 -0400 (0:00:00.927)       0:00:22.134 **** 
  8303 1726773045.55622: entering _queue_task() for managed_node3/file
  8303 1726773045.55825: worker is 1 (out of 1 available)
  8303 1726773045.55839: exiting _queue_task() for managed_node3/file
  8303 1726773045.55851: done queuing things up, now waiting for results queue to drain
  8303 1726773045.55853: waiting for pending results...
  9336 1726773045.56065: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists
  9336 1726773045.56217: in run() - task 0affffe7-6841-6cfb-81ae-000000000201
  9336 1726773045.56235: variable 'ansible_search_path' from source: unknown
  9336 1726773045.56240: variable 'ansible_search_path' from source: unknown
  9336 1726773045.56277: calling self._execute()
  9336 1726773045.56357: variable 'ansible_host' from source: host vars for 'managed_node3'
  9336 1726773045.56370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9336 1726773045.56380: variable 'omit' from source: magic vars
  9336 1726773045.56481: variable 'omit' from source: magic vars
  9336 1726773045.56536: variable 'omit' from source: magic vars
  9336 1726773045.56563: variable '__kernel_settings_profile_dir' from source: role '' all vars
  9336 1726773045.56852: variable '__kernel_settings_profile_dir' from source: role '' all vars
  9336 1726773045.57233: variable '__kernel_settings_profile_parent' from source: set_fact
  9336 1726773045.57242: variable '__kernel_settings_tuned_profile' from source: role '' all vars
  9336 1726773045.57288: variable 'omit' from source: magic vars
  9336 1726773045.57325: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  9336 1726773045.57359: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  9336 1726773045.57384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  9336 1726773045.57404: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9336 1726773045.57416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9336 1726773045.57444: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  9336 1726773045.57449: variable 'ansible_host' from source: host vars for 'managed_node3'
  9336 1726773045.57453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9336 1726773045.57555: Set connection var ansible_pipelining to False
  9336 1726773045.57569: Set connection var ansible_timeout to 10
  9336 1726773045.57576: Set connection var ansible_module_compression to ZIP_DEFLATED
  9336 1726773045.57582: Set connection var ansible_shell_executable to /bin/sh
  9336 1726773045.57587: Set connection var ansible_connection to ssh
  9336 1726773045.57594: Set connection var ansible_shell_type to sh
  9336 1726773045.57612: variable 'ansible_shell_executable' from source: unknown
  9336 1726773045.57616: variable 'ansible_connection' from source: unknown
  9336 1726773045.57619: variable 'ansible_module_compression' from source: unknown
  9336 1726773045.57622: variable 'ansible_shell_type' from source: unknown
  9336 1726773045.57624: variable 'ansible_shell_executable' from source: unknown
  9336 1726773045.57627: variable 'ansible_host' from source: host vars for 'managed_node3'
  9336 1726773045.57630: variable 'ansible_pipelining' from source: unknown
  9336 1726773045.57632: variable 'ansible_timeout' from source: unknown
  9336 1726773045.57636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9336 1726773045.57822: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
  9336 1726773045.57834: variable 'omit' from source: magic vars
  9336 1726773045.57840: starting attempt loop
  9336 1726773045.57843: running the handler
  9336 1726773045.57855: _low_level_execute_command(): starting
  9336 1726773045.57863: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  9336 1726773045.60416: stdout chunk (state=2):
>>>/root
<<<
  9336 1726773045.60535: stderr chunk (state=3):
>>><<<
  9336 1726773045.60543: stdout chunk (state=3):
>>><<<
  9336 1726773045.60567: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  9336 1726773045.60583: _low_level_execute_command(): starting
  9336 1726773045.60592: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773045.6057682-9336-39087586682607 `" && echo ansible-tmp-1726773045.6057682-9336-39087586682607="` echo /root/.ansible/tmp/ansible-tmp-1726773045.6057682-9336-39087586682607 `" ) && sleep 0'
  9336 1726773045.63128: stdout chunk (state=2):
>>>ansible-tmp-1726773045.6057682-9336-39087586682607=/root/.ansible/tmp/ansible-tmp-1726773045.6057682-9336-39087586682607
<<<
  9336 1726773045.63257: stderr chunk (state=3):
>>><<<
  9336 1726773045.63267: stdout chunk (state=3):
>>><<<
  9336 1726773045.63284: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773045.6057682-9336-39087586682607=/root/.ansible/tmp/ansible-tmp-1726773045.6057682-9336-39087586682607
, stderr=
  9336 1726773045.63323: variable 'ansible_module_compression' from source: unknown
  9336 1726773045.63369: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED
  9336 1726773045.63407: variable 'ansible_facts' from source: unknown
  9336 1726773045.63480: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773045.6057682-9336-39087586682607/AnsiballZ_file.py
  9336 1726773045.63584: Sending initial data
  9336 1726773045.63594: Sent initial data (150 bytes)
  9336 1726773045.66237: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpka24412j /root/.ansible/tmp/ansible-tmp-1726773045.6057682-9336-39087586682607/AnsiballZ_file.py
<<<
  9336 1726773045.67573: stderr chunk (state=3):
>>><<<
  9336 1726773045.67583: stdout chunk (state=3):
>>><<<
  9336 1726773045.67605: done transferring module to remote
  9336 1726773045.67616: _low_level_execute_command(): starting
  9336 1726773045.67621: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773045.6057682-9336-39087586682607/ /root/.ansible/tmp/ansible-tmp-1726773045.6057682-9336-39087586682607/AnsiballZ_file.py && sleep 0'
  9336 1726773045.70077: stderr chunk (state=2):
>>><<<
  9336 1726773045.70089: stdout chunk (state=2):
>>><<<
  9336 1726773045.70106: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9336 1726773045.70111: _low_level_execute_command(): starting
  9336 1726773045.70116: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773045.6057682-9336-39087586682607/AnsiballZ_file.py && sleep 0'
  9336 1726773045.86372: stdout chunk (state=2):
>>>
{"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}}
<<<
  9336 1726773045.87529: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  9336 1726773045.87586: stderr chunk (state=3):
>>><<<
  9336 1726773045.87593: stdout chunk (state=3):
>>><<<
  9336 1726773045.87610: _low_level_execute_command() done: rc=0, stdout=
{"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}}
, stderr=Shared connection to 10.31.47.99 closed.
  9336 1726773045.87643: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773045.6057682-9336-39087586682607/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  9336 1726773045.87653: _low_level_execute_command(): starting
  9336 1726773045.87659: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773045.6057682-9336-39087586682607/ > /dev/null 2>&1 && sleep 0'
  9336 1726773045.90136: stderr chunk (state=2):
>>><<<
  9336 1726773045.90146: stdout chunk (state=2):
>>><<<
  9336 1726773045.90162: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9336 1726773045.90168: handler run complete
  9336 1726773045.90190: attempt loop complete, returning result
  9336 1726773045.90195: _execute() done
  9336 1726773045.90198: dumping result to json
  9336 1726773045.90204: done dumping result, returning
  9336 1726773045.90212: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists [0affffe7-6841-6cfb-81ae-000000000201]
  9336 1726773045.90218: sending task result for task 0affffe7-6841-6cfb-81ae-000000000201
  9336 1726773045.90252: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000201
  9336 1726773045.90256: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "gid": 0,
    "group": "root",
    "mode": "0755",
    "owner": "root",
    "path": "/etc/tuned/kernel_settings",
    "secontext": "unconfined_u:object_r:tuned_etc_t:s0",
    "size": 24,
    "state": "directory",
    "uid": 0
}
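This result comes from the file module called with path=/etc/tuned/kernel_settings, state=directory, mode='0755' (see the module_args above); changed is false because the directory already exists with those attributes. A task equivalent to that invocation is sketched below; the hard-coded path stands in for the role's __kernel_settings_profile_dir variable, which the log shows being resolved just before the call.

    # Sketch: directory task matching the file module_args above.
    # In the role the path is derived from __kernel_settings_profile_dir.
    - name: Ensure kernel settings profile directory exists
      ansible.builtin.file:
        path: /etc/tuned/kernel_settings
        state: directory
        mode: "0755"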
  8303 1726773045.90427: no more pending results, returning what we have
  8303 1726773045.90430: results queue empty
  8303 1726773045.90430: checking for any_errors_fatal
  8303 1726773045.90449: done checking for any_errors_fatal
  8303 1726773045.90450: checking for max_fail_percentage
  8303 1726773045.90452: done checking for max_fail_percentage
  8303 1726773045.90452: checking to see if all hosts have failed and the running result is not ok
  8303 1726773045.90453: done checking to see if all hosts have failed
  8303 1726773045.90453: getting the remaining hosts for this loop
  8303 1726773045.90454: done getting the remaining hosts for this loop
  8303 1726773045.90458: getting the next task for host managed_node3
  8303 1726773045.90463: done getting next task for host managed_node3
  8303 1726773045.90468:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile
  8303 1726773045.90471:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773045.90481: getting variables
  8303 1726773045.90482: in VariableManager get_vars()
  8303 1726773045.90511: Calling all_inventory to load vars for managed_node3
  8303 1726773045.90513: Calling groups_inventory to load vars for managed_node3
  8303 1726773045.90514: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773045.90521: Calling all_plugins_play to load vars for managed_node3
  8303 1726773045.90523: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773045.90524: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773045.90560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773045.90598: done with get_vars()
  8303 1726773045.90605: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Get active_profile] **********
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80
Thursday 19 September 2024  15:10:45 -0400 (0:00:00.350)       0:00:22.485 **** 
  8303 1726773045.90671: entering _queue_task() for managed_node3/slurp
  8303 1726773045.90846: worker is 1 (out of 1 available)
  8303 1726773045.90861: exiting _queue_task() for managed_node3/slurp
  8303 1726773045.90877: done queuing things up, now waiting for results queue to drain
  8303 1726773045.90878: waiting for pending results...
  9352 1726773045.90997: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile
  9352 1726773045.91116: in run() - task 0affffe7-6841-6cfb-81ae-000000000202
  9352 1726773045.91133: variable 'ansible_search_path' from source: unknown
  9352 1726773045.91138: variable 'ansible_search_path' from source: unknown
  9352 1726773045.91168: calling self._execute()
  9352 1726773045.91227: variable 'ansible_host' from source: host vars for 'managed_node3'
  9352 1726773045.91236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9352 1726773045.91245: variable 'omit' from source: magic vars
  9352 1726773045.91319: variable 'omit' from source: magic vars
  9352 1726773045.91358: variable 'omit' from source: magic vars
  9352 1726773045.91376: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars
  9352 1726773045.91601: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars
  9352 1726773045.91660: variable '__kernel_settings_tuned_dir' from source: role '' all vars
  9352 1726773045.91688: variable 'omit' from source: magic vars
  9352 1726773045.91718: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  9352 1726773045.91741: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  9352 1726773045.91758: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  9352 1726773045.91772: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9352 1726773045.91782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9352 1726773045.91805: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  9352 1726773045.91808: variable 'ansible_host' from source: host vars for 'managed_node3'
  9352 1726773045.91811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9352 1726773045.91876: Set connection var ansible_pipelining to False
  9352 1726773045.91893: Set connection var ansible_timeout to 10
  9352 1726773045.91900: Set connection var ansible_module_compression to ZIP_DEFLATED
  9352 1726773045.91906: Set connection var ansible_shell_executable to /bin/sh
  9352 1726773045.91910: Set connection var ansible_connection to ssh
  9352 1726773045.91918: Set connection var ansible_shell_type to sh
  9352 1726773045.91933: variable 'ansible_shell_executable' from source: unknown
  9352 1726773045.91938: variable 'ansible_connection' from source: unknown
  9352 1726773045.91941: variable 'ansible_module_compression' from source: unknown
  9352 1726773045.91944: variable 'ansible_shell_type' from source: unknown
  9352 1726773045.91948: variable 'ansible_shell_executable' from source: unknown
  9352 1726773045.91951: variable 'ansible_host' from source: host vars for 'managed_node3'
  9352 1726773045.91955: variable 'ansible_pipelining' from source: unknown
  9352 1726773045.91959: variable 'ansible_timeout' from source: unknown
  9352 1726773045.91963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9352 1726773045.92106: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
  9352 1726773045.92116: variable 'omit' from source: magic vars
  9352 1726773045.92122: starting attempt loop
  9352 1726773045.92125: running the handler
  9352 1726773045.92135: _low_level_execute_command(): starting
  9352 1726773045.92142: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  9352 1726773045.94477: stdout chunk (state=2):
>>>/root
<<<
  9352 1726773045.94606: stderr chunk (state=3):
>>><<<
  9352 1726773045.94613: stdout chunk (state=3):
>>><<<
  9352 1726773045.94632: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  9352 1726773045.94646: _low_level_execute_command(): starting
  9352 1726773045.94652: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773045.9464004-9352-235962881933281 `" && echo ansible-tmp-1726773045.9464004-9352-235962881933281="` echo /root/.ansible/tmp/ansible-tmp-1726773045.9464004-9352-235962881933281 `" ) && sleep 0'
  9352 1726773045.97180: stdout chunk (state=2):
>>>ansible-tmp-1726773045.9464004-9352-235962881933281=/root/.ansible/tmp/ansible-tmp-1726773045.9464004-9352-235962881933281
<<<
  9352 1726773045.97308: stderr chunk (state=3):
>>><<<
  9352 1726773045.97316: stdout chunk (state=3):
>>><<<
  9352 1726773045.97333: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773045.9464004-9352-235962881933281=/root/.ansible/tmp/ansible-tmp-1726773045.9464004-9352-235962881933281
, stderr=
  9352 1726773045.97374: variable 'ansible_module_compression' from source: unknown
  9352 1726773045.97413: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.slurp-ZIP_DEFLATED
  9352 1726773045.97444: variable 'ansible_facts' from source: unknown
  9352 1726773045.97519: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773045.9464004-9352-235962881933281/AnsiballZ_slurp.py
  9352 1726773045.97625: Sending initial data
  9352 1726773045.97632: Sent initial data (152 bytes)
  9352 1726773046.00263: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmp_u2vi33x /root/.ansible/tmp/ansible-tmp-1726773045.9464004-9352-235962881933281/AnsiballZ_slurp.py
<<<
  9352 1726773046.01426: stderr chunk (state=3):
>>><<<
  9352 1726773046.01435: stdout chunk (state=3):
>>><<<
  9352 1726773046.01457: done transferring module to remote
  9352 1726773046.01469: _low_level_execute_command(): starting
  9352 1726773046.01475: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773045.9464004-9352-235962881933281/ /root/.ansible/tmp/ansible-tmp-1726773045.9464004-9352-235962881933281/AnsiballZ_slurp.py && sleep 0'
  9352 1726773046.03932: stderr chunk (state=2):
>>><<<
  9352 1726773046.03943: stdout chunk (state=2):
>>><<<
  9352 1726773046.03960: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9352 1726773046.03965: _low_level_execute_command(): starting
  9352 1726773046.03970: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773045.9464004-9352-235962881933281/AnsiballZ_slurp.py && sleep 0'
  9352 1726773046.19017: stdout chunk (state=2):
>>>
{"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}}
<<<
  9352 1726773046.20077: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  9352 1726773046.20127: stderr chunk (state=3):
>>><<<
  9352 1726773046.20135: stdout chunk (state=3):
>>><<<
  9352 1726773046.20150: _low_level_execute_command() done: rc=0, stdout=
{"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}}
, stderr=Shared connection to 10.31.47.99 closed.
  9352 1726773046.20173: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773045.9464004-9352-235962881933281/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  9352 1726773046.20187: _low_level_execute_command(): starting
  9352 1726773046.20193: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773045.9464004-9352-235962881933281/ > /dev/null 2>&1 && sleep 0'
  9352 1726773046.22711: stderr chunk (state=2):
>>><<<
  9352 1726773046.22722: stdout chunk (state=2):
>>><<<
  9352 1726773046.22738: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9352 1726773046.22744: handler run complete
  9352 1726773046.22757: attempt loop complete, returning result
  9352 1726773046.22761: _execute() done
  9352 1726773046.22765: dumping result to json
  9352 1726773046.22772: done dumping result, returning
  9352 1726773046.22781: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile [0affffe7-6841-6cfb-81ae-000000000202]
  9352 1726773046.22788: sending task result for task 0affffe7-6841-6cfb-81ae-000000000202
  9352 1726773046.22818: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000202
  9352 1726773046.22822: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK",
    "encoding": "base64",
    "source": "/etc/tuned/active_profile"
}
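slurp returns the file contents base64-encoded; here "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK" decodes to "virtual-guest kernel_settings", i.e. the tuned active_profile already lists the kernel_settings child profile. A sketch of reading and decoding the value in a play follows; the register name and the debug step are illustrative, not part of the role.

    # Sketch: read /etc/tuned/active_profile and decode the slurp payload.
    # The register name is hypothetical.
    - name: Get active_profile
      ansible.builtin.slurp:
        path: /etc/tuned/active_profile
      register: __active_profile_raw

    - name: Show decoded active profile
      ansible.builtin.debug:
        msg: "{{ __active_profile_raw.content | b64decode | trim }}"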
  8303 1726773046.23042: no more pending results, returning what we have
  8303 1726773046.23045: results queue empty
  8303 1726773046.23045: checking for any_errors_fatal
  8303 1726773046.23051: done checking for any_errors_fatal
  8303 1726773046.23052: checking for max_fail_percentage
  8303 1726773046.23053: done checking for max_fail_percentage
  8303 1726773046.23053: checking to see if all hosts have failed and the running result is not ok
  8303 1726773046.23053: done checking to see if all hosts have failed
  8303 1726773046.23054: getting the remaining hosts for this loop
  8303 1726773046.23055: done getting the remaining hosts for this loop
  8303 1726773046.23057: getting the next task for host managed_node3
  8303 1726773046.23061: done getting next task for host managed_node3
  8303 1726773046.23063:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile
  8303 1726773046.23066:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773046.23073: getting variables
  8303 1726773046.23074: in VariableManager get_vars()
  8303 1726773046.23102: Calling all_inventory to load vars for managed_node3
  8303 1726773046.23104: Calling groups_inventory to load vars for managed_node3
  8303 1726773046.23105: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773046.23112: Calling all_plugins_play to load vars for managed_node3
  8303 1726773046.23113: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773046.23115: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773046.23152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773046.23188: done with get_vars()
  8303 1726773046.23194: done getting variables
  8303 1726773046.23234: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set active_profile] **********
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:85
Thursday 19 September 2024  15:10:46 -0400 (0:00:00.325)       0:00:22.811 **** 
  8303 1726773046.23259: entering _queue_task() for managed_node3/set_fact
  8303 1726773046.23428: worker is 1 (out of 1 available)
  8303 1726773046.23443: exiting _queue_task() for managed_node3/set_fact
  8303 1726773046.23456: done queuing things up, now waiting for results queue to drain
  8303 1726773046.23457: waiting for pending results...
  9360 1726773046.23577: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile
  9360 1726773046.23693: in run() - task 0affffe7-6841-6cfb-81ae-000000000203
  9360 1726773046.23710: variable 'ansible_search_path' from source: unknown
  9360 1726773046.23714: variable 'ansible_search_path' from source: unknown
  9360 1726773046.23743: calling self._execute()
  9360 1726773046.23803: variable 'ansible_host' from source: host vars for 'managed_node3'
  9360 1726773046.23813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9360 1726773046.23821: variable 'omit' from source: magic vars
  9360 1726773046.23896: variable 'omit' from source: magic vars
  9360 1726773046.23933: variable 'omit' from source: magic vars
  9360 1726773046.24234: variable '__kernel_settings_tuned_profile' from source: role '' all vars
  9360 1726773046.24245: variable '__cur_profile' from source: task vars
  9360 1726773046.24352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
  9360 1726773046.25915: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
  9360 1726773046.25964: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
  9360 1726773046.25998: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
  9360 1726773046.26025: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
  9360 1726773046.26044: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
  9360 1726773046.26104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  9360 1726773046.26126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  9360 1726773046.26144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  9360 1726773046.26175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  9360 1726773046.26191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  9360 1726773046.26264: variable '__kernel_settings_tuned_current_profile' from source: set_fact
  9360 1726773046.26310: variable 'omit' from source: magic vars
  9360 1726773046.26332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  9360 1726773046.26353: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  9360 1726773046.26371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  9360 1726773046.26386: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9360 1726773046.26397: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9360 1726773046.26422: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  9360 1726773046.26427: variable 'ansible_host' from source: host vars for 'managed_node3'
  9360 1726773046.26431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9360 1726773046.26498: Set connection var ansible_pipelining to False
  9360 1726773046.26509: Set connection var ansible_timeout to 10
  9360 1726773046.26516: Set connection var ansible_module_compression to ZIP_DEFLATED
  9360 1726773046.26522: Set connection var ansible_shell_executable to /bin/sh
  9360 1726773046.26525: Set connection var ansible_connection to ssh
  9360 1726773046.26532: Set connection var ansible_shell_type to sh
  9360 1726773046.26548: variable 'ansible_shell_executable' from source: unknown
  9360 1726773046.26552: variable 'ansible_connection' from source: unknown
  9360 1726773046.26555: variable 'ansible_module_compression' from source: unknown
  9360 1726773046.26558: variable 'ansible_shell_type' from source: unknown
  9360 1726773046.26562: variable 'ansible_shell_executable' from source: unknown
  9360 1726773046.26565: variable 'ansible_host' from source: host vars for 'managed_node3'
  9360 1726773046.26572: variable 'ansible_pipelining' from source: unknown
  9360 1726773046.26576: variable 'ansible_timeout' from source: unknown
  9360 1726773046.26580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9360 1726773046.26644: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  9360 1726773046.26655: variable 'omit' from source: magic vars
  9360 1726773046.26660: starting attempt loop
  9360 1726773046.26664: running the handler
  9360 1726773046.26675: handler run complete
  9360 1726773046.26682: attempt loop complete, returning result
  9360 1726773046.26688: _execute() done
  9360 1726773046.26692: dumping result to json
  9360 1726773046.26696: done dumping result, returning
  9360 1726773046.26702: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile [0affffe7-6841-6cfb-81ae-000000000203]
  9360 1726773046.26708: sending task result for task 0affffe7-6841-6cfb-81ae-000000000203
  9360 1726773046.26730: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000203
  9360 1726773046.26733: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "__kernel_settings_active_profile": "virtual-guest kernel_settings"
    },
    "changed": false
}
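The fact above is built from the current tuned profile and the role's own profile name; the log shows the task reading '__kernel_settings_tuned_profile' and '__kernel_settings_tuned_current_profile' before setting '__kernel_settings_active_profile'. One plausible formulation, purely as a sketch (the exact Jinja expression is not visible in this log):

    - name: Set active_profile
      set_fact:
        __kernel_settings_active_profile: >-
          {{ __kernel_settings_tuned_current_profile
             if __kernel_settings_tuned_profile in __kernel_settings_tuned_current_profile
             else __kernel_settings_tuned_current_profile ~ ' ' ~ __kernel_settings_tuned_profile }}

In this run the current profile already contained 'kernel_settings', so the fact comes out unchanged as 'virtual-guest kernel_settings'.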
  8303 1726773046.26875: no more pending results, returning what we have
  8303 1726773046.26878: results queue empty
  8303 1726773046.26879: checking for any_errors_fatal
  8303 1726773046.26887: done checking for any_errors_fatal
  8303 1726773046.26887: checking for max_fail_percentage
  8303 1726773046.26889: done checking for max_fail_percentage
  8303 1726773046.26889: checking to see if all hosts have failed and the running result is not ok
  8303 1726773046.26890: done checking to see if all hosts have failed
  8303 1726773046.26890: getting the remaining hosts for this loop
  8303 1726773046.26891: done getting the remaining hosts for this loop
  8303 1726773046.26894: getting the next task for host managed_node3
  8303 1726773046.26899: done getting next task for host managed_node3
  8303 1726773046.26902:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile
  8303 1726773046.26905:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773046.26920: getting variables
  8303 1726773046.26922: in VariableManager get_vars()
  8303 1726773046.26952: Calling all_inventory to load vars for managed_node3
  8303 1726773046.26955: Calling groups_inventory to load vars for managed_node3
  8303 1726773046.26956: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773046.26964: Calling all_plugins_play to load vars for managed_node3
  8303 1726773046.26966: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773046.26968: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773046.27005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773046.27036: done with get_vars()
  8303 1726773046.27042: done getting variables
  8303 1726773046.27082: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91
Thursday 19 September 2024  15:10:46 -0400 (0:00:00.038)       0:00:22.849 **** 
  8303 1726773046.27106: entering _queue_task() for managed_node3/copy
  8303 1726773046.27268: worker is 1 (out of 1 available)
  8303 1726773046.27283: exiting _queue_task() for managed_node3/copy
  8303 1726773046.27296: done queuing things up, now waiting for results queue to drain
  8303 1726773046.27298: waiting for pending results...
  9361 1726773046.27417: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile
  9361 1726773046.27526: in run() - task 0affffe7-6841-6cfb-81ae-000000000204
  9361 1726773046.27542: variable 'ansible_search_path' from source: unknown
  9361 1726773046.27546: variable 'ansible_search_path' from source: unknown
  9361 1726773046.27577: calling self._execute()
  9361 1726773046.27634: variable 'ansible_host' from source: host vars for 'managed_node3'
  9361 1726773046.27644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9361 1726773046.27653: variable 'omit' from source: magic vars
  9361 1726773046.27727: variable 'omit' from source: magic vars
  9361 1726773046.27769: variable 'omit' from source: magic vars
  9361 1726773046.27792: variable '__kernel_settings_active_profile' from source: set_fact
  9361 1726773046.28013: variable '__kernel_settings_active_profile' from source: set_fact
  9361 1726773046.28036: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars
  9361 1726773046.28094: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars
  9361 1726773046.28145: variable '__kernel_settings_tuned_dir' from source: role '' all vars
  9361 1726773046.28168: variable 'omit' from source: magic vars
  9361 1726773046.28200: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  9361 1726773046.28223: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  9361 1726773046.28238: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  9361 1726773046.28248: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9361 1726773046.28256: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9361 1726773046.28280: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  9361 1726773046.28300: variable 'ansible_host' from source: host vars for 'managed_node3'
  9361 1726773046.28305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9361 1726773046.28374: Set connection var ansible_pipelining to False
  9361 1726773046.28384: Set connection var ansible_timeout to 10
  9361 1726773046.28392: Set connection var ansible_module_compression to ZIP_DEFLATED
  9361 1726773046.28398: Set connection var ansible_shell_executable to /bin/sh
  9361 1726773046.28402: Set connection var ansible_connection to ssh
  9361 1726773046.28408: Set connection var ansible_shell_type to sh
  9361 1726773046.28423: variable 'ansible_shell_executable' from source: unknown
  9361 1726773046.28427: variable 'ansible_connection' from source: unknown
  9361 1726773046.28431: variable 'ansible_module_compression' from source: unknown
  9361 1726773046.28434: variable 'ansible_shell_type' from source: unknown
  9361 1726773046.28438: variable 'ansible_shell_executable' from source: unknown
  9361 1726773046.28441: variable 'ansible_host' from source: host vars for 'managed_node3'
  9361 1726773046.28445: variable 'ansible_pipelining' from source: unknown
  9361 1726773046.28449: variable 'ansible_timeout' from source: unknown
  9361 1726773046.28453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9361 1726773046.28544: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  9361 1726773046.28556: variable 'omit' from source: magic vars
  9361 1726773046.28562: starting attempt loop
  9361 1726773046.28565: running the handler
  9361 1726773046.28577: _low_level_execute_command(): starting
  9361 1726773046.28587: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  9361 1726773046.30998: stdout chunk (state=2):
>>>/root
<<<
  9361 1726773046.31120: stderr chunk (state=3):
>>><<<
  9361 1726773046.31127: stdout chunk (state=3):
>>><<<
  9361 1726773046.31147: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  9361 1726773046.31162: _low_level_execute_command(): starting
  9361 1726773046.31170: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773046.3115678-9361-75932391485559 `" && echo ansible-tmp-1726773046.3115678-9361-75932391485559="` echo /root/.ansible/tmp/ansible-tmp-1726773046.3115678-9361-75932391485559 `" ) && sleep 0'
  9361 1726773046.33767: stdout chunk (state=2):
>>>ansible-tmp-1726773046.3115678-9361-75932391485559=/root/.ansible/tmp/ansible-tmp-1726773046.3115678-9361-75932391485559
<<<
  9361 1726773046.33899: stderr chunk (state=3):
>>><<<
  9361 1726773046.33906: stdout chunk (state=3):
>>><<<
  9361 1726773046.33921: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773046.3115678-9361-75932391485559=/root/.ansible/tmp/ansible-tmp-1726773046.3115678-9361-75932391485559
, stderr=
  9361 1726773046.33998: variable 'ansible_module_compression' from source: unknown
  9361 1726773046.34040: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
  9361 1726773046.34073: variable 'ansible_facts' from source: unknown
  9361 1726773046.34148: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773046.3115678-9361-75932391485559/AnsiballZ_stat.py
  9361 1726773046.34323: Sending initial data
  9361 1726773046.34330: Sent initial data (150 bytes)
  9361 1726773046.36946: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpw1sjc0fc /root/.ansible/tmp/ansible-tmp-1726773046.3115678-9361-75932391485559/AnsiballZ_stat.py
<<<
  9361 1726773046.38137: stderr chunk (state=3):
>>><<<
  9361 1726773046.38146: stdout chunk (state=3):
>>><<<
  9361 1726773046.38171: done transferring module to remote
  9361 1726773046.38183: _low_level_execute_command(): starting
  9361 1726773046.38190: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773046.3115678-9361-75932391485559/ /root/.ansible/tmp/ansible-tmp-1726773046.3115678-9361-75932391485559/AnsiballZ_stat.py && sleep 0'
  9361 1726773046.40688: stderr chunk (state=2):
>>><<<
  9361 1726773046.40698: stdout chunk (state=2):
>>><<<
  9361 1726773046.40715: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9361 1726773046.40720: _low_level_execute_command(): starting
  9361 1726773046.40726: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773046.3115678-9361-75932391485559/AnsiballZ_stat.py && sleep 0'
  9361 1726773046.57225: stdout chunk (state=2):
>>>
{"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 499122370, "dev": 51713, "nlink": 1, "atime": 1726773046.1880348, "mtime": 1726773038.3080044, "ctime": 1726773038.3080044, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "1754931174", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}}
<<<
  9361 1726773046.58673: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  9361 1726773046.58688: stdout chunk (state=3):
>>><<<
  9361 1726773046.58700: stderr chunk (state=3):
>>><<<
  9361 1726773046.58716: _low_level_execute_command() done: rc=0, stdout=
{"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 499122370, "dev": 51713, "nlink": 1, "atime": 1726773046.1880348, "mtime": 1726773038.3080044, "ctime": 1726773038.3080044, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "1754931174", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}}
, stderr=Shared connection to 10.31.47.99 closed.
  9361 1726773046.58791: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773046.3115678-9361-75932391485559/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  9361 1726773046.58832: variable 'ansible_module_compression' from source: unknown
  9361 1726773046.58875: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED
  9361 1726773046.58903: variable 'ansible_facts' from source: unknown
  9361 1726773046.58993: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773046.3115678-9361-75932391485559/AnsiballZ_file.py
  9361 1726773046.59443: Sending initial data
  9361 1726773046.59450: Sent initial data (150 bytes)
  9361 1726773046.62199: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpl5pfrv40 /root/.ansible/tmp/ansible-tmp-1726773046.3115678-9361-75932391485559/AnsiballZ_file.py
<<<
  9361 1726773046.63445: stderr chunk (state=3):
>>><<<
  9361 1726773046.63454: stdout chunk (state=3):
>>><<<
  9361 1726773046.63475: done transferring module to remote
  9361 1726773046.63487: _low_level_execute_command(): starting
  9361 1726773046.63492: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773046.3115678-9361-75932391485559/ /root/.ansible/tmp/ansible-tmp-1726773046.3115678-9361-75932391485559/AnsiballZ_file.py && sleep 0'
  9361 1726773046.65981: stderr chunk (state=2):
>>><<<
  9361 1726773046.65994: stdout chunk (state=2):
>>><<<
  9361 1726773046.66010: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9361 1726773046.66014: _low_level_execute_command(): starting
  9361 1726773046.66020: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773046.3115678-9361-75932391485559/AnsiballZ_file.py && sleep 0'
  9361 1726773046.82293: stdout chunk (state=2):
>>>
{"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmpd6eh_qsj", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}}
<<<
  9361 1726773046.83453: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  9361 1726773046.83503: stderr chunk (state=3):
>>><<<
  9361 1726773046.83511: stdout chunk (state=3):
>>><<<
  9361 1726773046.83528: _low_level_execute_command() done: rc=0, stdout=
{"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmpd6eh_qsj", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}}
, stderr=Shared connection to 10.31.47.99 closed.
  9361 1726773046.83556: done with _execute_module (ansible.legacy.file, {'mode': '0600', 'dest': '/etc/tuned/active_profile', '_original_basename': 'tmpd6eh_qsj', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773046.3115678-9361-75932391485559/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  9361 1726773046.83567: _low_level_execute_command(): starting
  9361 1726773046.83573: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773046.3115678-9361-75932391485559/ > /dev/null 2>&1 && sleep 0'
  9361 1726773046.86069: stderr chunk (state=2):
>>><<<
  9361 1726773046.86078: stdout chunk (state=2):
>>><<<
  9361 1726773046.86095: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9361 1726773046.86102: handler run complete
  9361 1726773046.86122: attempt loop complete, returning result
  9361 1726773046.86126: _execute() done
  9361 1726773046.86130: dumping result to json
  9361 1726773046.86135: done dumping result, returning
  9361 1726773046.86145: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile [0affffe7-6841-6cfb-81ae-000000000204]
  9361 1726773046.86153: sending task result for task 0affffe7-6841-6cfb-81ae-000000000204
  9361 1726773046.86188: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000204
  9361 1726773046.86192: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd",
    "dest": "/etc/tuned/active_profile",
    "gid": 0,
    "group": "root",
    "mode": "0600",
    "owner": "root",
    "path": "/etc/tuned/active_profile",
    "secontext": "system_u:object_r:tuned_rw_etc_t:s0",
    "size": 30,
    "state": "file",
    "uid": 0
}
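For this copy task the action plugin first ran ansible.legacy.stat on the destination; because the existing checksum already matched the desired content, no file was transferred and only ansible.legacy.file was run to re-assert owner, group and mode, which is why the task reports changed: false. A minimal sketch of a task consistent with this result, where the content expression is an assumption:

    - name: Ensure kernel_settings is in active_profile
      copy:
        content: "{{ __kernel_settings_active_profile }}\n"
        dest: /etc/tuned/active_profile
        mode: "0600"

The 30-byte destination file matches 'virtual-guest kernel_settings' plus a trailing newline.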
  8303 1726773046.86357: no more pending results, returning what we have
  8303 1726773046.86360: results queue empty
  8303 1726773046.86361: checking for any_errors_fatal
  8303 1726773046.86369: done checking for any_errors_fatal
  8303 1726773046.86369: checking for max_fail_percentage
  8303 1726773046.86371: done checking for max_fail_percentage
  8303 1726773046.86371: checking to see if all hosts have failed and the running result is not ok
  8303 1726773046.86372: done checking to see if all hosts have failed
  8303 1726773046.86372: getting the remaining hosts for this loop
  8303 1726773046.86373: done getting the remaining hosts for this loop
  8303 1726773046.86376: getting the next task for host managed_node3
  8303 1726773046.86384: done getting next task for host managed_node3
  8303 1726773046.86386:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual
  8303 1726773046.86389:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773046.86399: getting variables
  8303 1726773046.86401: in VariableManager get_vars()
  8303 1726773046.86430: Calling all_inventory to load vars for managed_node3
  8303 1726773046.86432: Calling groups_inventory to load vars for managed_node3
  8303 1726773046.86433: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773046.86440: Calling all_plugins_play to load vars for managed_node3
  8303 1726773046.86441: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773046.86443: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773046.86481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773046.86516: done with get_vars()
  8303 1726773046.86522: done getting variables
  8303 1726773046.86561: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set profile_mode to manual] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99
Thursday 19 September 2024  15:10:46 -0400 (0:00:00.594)       0:00:23.444 **** 
  8303 1726773046.86589: entering _queue_task() for managed_node3/copy
  8303 1726773046.86750: worker is 1 (out of 1 available)
  8303 1726773046.86764: exiting _queue_task() for managed_node3/copy
  8303 1726773046.86778: done queuing things up, now waiting for results queue to drain
  8303 1726773046.86780: waiting for pending results...
  9380 1726773046.86892: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual
  9380 1726773046.86999: in run() - task 0affffe7-6841-6cfb-81ae-000000000205
  9380 1726773046.87014: variable 'ansible_search_path' from source: unknown
  9380 1726773046.87018: variable 'ansible_search_path' from source: unknown
  9380 1726773046.87046: calling self._execute()
  9380 1726773046.87103: variable 'ansible_host' from source: host vars for 'managed_node3'
  9380 1726773046.87113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9380 1726773046.87121: variable 'omit' from source: magic vars
  9380 1726773046.87192: variable 'omit' from source: magic vars
  9380 1726773046.87233: variable 'omit' from source: magic vars
  9380 1726773046.87254: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars
  9380 1726773046.87471: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars
  9380 1726773046.87602: variable '__kernel_settings_tuned_dir' from source: role '' all vars
  9380 1726773046.87630: variable 'omit' from source: magic vars
  9380 1726773046.87662: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  9380 1726773046.87690: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  9380 1726773046.87707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  9380 1726773046.87720: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9380 1726773046.87732: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9380 1726773046.87755: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  9380 1726773046.87760: variable 'ansible_host' from source: host vars for 'managed_node3'
  9380 1726773046.87764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9380 1726773046.87832: Set connection var ansible_pipelining to False
  9380 1726773046.87842: Set connection var ansible_timeout to 10
  9380 1726773046.87848: Set connection var ansible_module_compression to ZIP_DEFLATED
  9380 1726773046.87854: Set connection var ansible_shell_executable to /bin/sh
  9380 1726773046.87857: Set connection var ansible_connection to ssh
  9380 1726773046.87864: Set connection var ansible_shell_type to sh
  9380 1726773046.87881: variable 'ansible_shell_executable' from source: unknown
  9380 1726773046.87887: variable 'ansible_connection' from source: unknown
  9380 1726773046.87891: variable 'ansible_module_compression' from source: unknown
  9380 1726773046.87894: variable 'ansible_shell_type' from source: unknown
  9380 1726773046.87897: variable 'ansible_shell_executable' from source: unknown
  9380 1726773046.87900: variable 'ansible_host' from source: host vars for 'managed_node3'
  9380 1726773046.87905: variable 'ansible_pipelining' from source: unknown
  9380 1726773046.87908: variable 'ansible_timeout' from source: unknown
  9380 1726773046.87912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9380 1726773046.88003: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  9380 1726773046.88015: variable 'omit' from source: magic vars
  9380 1726773046.88020: starting attempt loop
  9380 1726773046.88024: running the handler
  9380 1726773046.88033: _low_level_execute_command(): starting
  9380 1726773046.88040: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  9380 1726773046.90472: stdout chunk (state=2):
>>>/root
<<<
  9380 1726773046.90582: stderr chunk (state=3):
>>><<<
  9380 1726773046.90591: stdout chunk (state=3):
>>><<<
  9380 1726773046.90616: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  9380 1726773046.90631: _low_level_execute_command(): starting
  9380 1726773046.90637: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773046.906255-9380-141727179191489 `" && echo ansible-tmp-1726773046.906255-9380-141727179191489="` echo /root/.ansible/tmp/ansible-tmp-1726773046.906255-9380-141727179191489 `" ) && sleep 0'
  9380 1726773046.93315: stdout chunk (state=2):
>>>ansible-tmp-1726773046.906255-9380-141727179191489=/root/.ansible/tmp/ansible-tmp-1726773046.906255-9380-141727179191489
<<<
  9380 1726773046.93450: stderr chunk (state=3):
>>><<<
  9380 1726773046.93458: stdout chunk (state=3):
>>><<<
  9380 1726773046.93481: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773046.906255-9380-141727179191489=/root/.ansible/tmp/ansible-tmp-1726773046.906255-9380-141727179191489
, stderr=
  9380 1726773046.93564: variable 'ansible_module_compression' from source: unknown
  9380 1726773046.93628: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
  9380 1726773046.93662: variable 'ansible_facts' from source: unknown
  9380 1726773046.93736: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773046.906255-9380-141727179191489/AnsiballZ_stat.py
  9380 1726773046.93842: Sending initial data
  9380 1726773046.93849: Sent initial data (150 bytes)
  9380 1726773046.96570: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpy9xvdz6j /root/.ansible/tmp/ansible-tmp-1726773046.906255-9380-141727179191489/AnsiballZ_stat.py
<<<
  9380 1726773046.97763: stderr chunk (state=3):
>>><<<
  9380 1726773046.97775: stdout chunk (state=3):
>>><<<
  9380 1726773046.97797: done transferring module to remote
  9380 1726773046.97808: _low_level_execute_command(): starting
  9380 1726773046.97813: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773046.906255-9380-141727179191489/ /root/.ansible/tmp/ansible-tmp-1726773046.906255-9380-141727179191489/AnsiballZ_stat.py && sleep 0'
  9380 1726773047.00350: stderr chunk (state=2):
>>><<<
  9380 1726773047.00368: stdout chunk (state=2):
>>><<<
  9380 1726773047.00390: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9380 1726773047.00397: _low_level_execute_command(): starting
  9380 1726773047.00405: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773046.906255-9380-141727179191489/AnsiballZ_stat.py && sleep 0'
  9380 1726773047.17091: stdout chunk (state=2):
>>>
{"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 505413763, "dev": 51713, "nlink": 1, "atime": 1726773038.2850044, "mtime": 1726773038.3090043, "ctime": 1726773038.3090043, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "911778068", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}}
<<<
  9380 1726773047.18274: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  9380 1726773047.18322: stderr chunk (state=3):
>>><<<
  9380 1726773047.18329: stdout chunk (state=3):
>>><<<
  9380 1726773047.18345: _low_level_execute_command() done: rc=0, stdout=
{"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 505413763, "dev": 51713, "nlink": 1, "atime": 1726773038.2850044, "mtime": 1726773038.3090043, "ctime": 1726773038.3090043, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "911778068", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}}
, stderr=Shared connection to 10.31.47.99 closed.
  9380 1726773047.18397: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773046.906255-9380-141727179191489/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  9380 1726773047.18432: variable 'ansible_module_compression' from source: unknown
  9380 1726773047.18472: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED
  9380 1726773047.18496: variable 'ansible_facts' from source: unknown
  9380 1726773047.18557: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773046.906255-9380-141727179191489/AnsiballZ_file.py
  9380 1726773047.18654: Sending initial data
  9380 1726773047.18661: Sent initial data (150 bytes)
  9380 1726773047.21320: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpaa71cex4 /root/.ansible/tmp/ansible-tmp-1726773046.906255-9380-141727179191489/AnsiballZ_file.py
<<<
  9380 1726773047.22591: stderr chunk (state=3):
>>><<<
  9380 1726773047.22599: stdout chunk (state=3):
>>><<<
  9380 1726773047.22617: done transferring module to remote
  9380 1726773047.22626: _low_level_execute_command(): starting
  9380 1726773047.22631: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773046.906255-9380-141727179191489/ /root/.ansible/tmp/ansible-tmp-1726773046.906255-9380-141727179191489/AnsiballZ_file.py && sleep 0'
  9380 1726773047.25098: stderr chunk (state=2):
>>><<<
  9380 1726773047.25110: stdout chunk (state=2):
>>><<<
  9380 1726773047.25125: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9380 1726773047.25130: _low_level_execute_command(): starting
  9380 1726773047.25135: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773046.906255-9380-141727179191489/AnsiballZ_file.py && sleep 0'
  9380 1726773047.41543: stdout chunk (state=2):
>>>
{"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmpq_oqynit", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}}
<<<
  9380 1726773047.42803: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  9380 1726773047.42816: stdout chunk (state=3):
>>><<<
  9380 1726773047.42827: stderr chunk (state=3):
>>><<<
  9380 1726773047.42842: _low_level_execute_command() done: rc=0, stdout=
{"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmpq_oqynit", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}}
, stderr=Shared connection to 10.31.47.99 closed.
  9380 1726773047.42881: done with _execute_module (ansible.legacy.file, {'mode': '0600', 'dest': '/etc/tuned/profile_mode', '_original_basename': 'tmpq_oqynit', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773046.906255-9380-141727179191489/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  9380 1726773047.42896: _low_level_execute_command(): starting
  9380 1726773047.42902: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773046.906255-9380-141727179191489/ > /dev/null 2>&1 && sleep 0'
  9380 1726773047.45489: stderr chunk (state=2):
>>><<<
  9380 1726773047.45501: stdout chunk (state=2):
>>><<<
  9380 1726773047.45520: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9380 1726773047.45528: handler run complete
  9380 1726773047.45557: attempt loop complete, returning result
  9380 1726773047.45563: _execute() done
  9380 1726773047.45567: dumping result to json
  9380 1726773047.45573: done dumping result, returning
  9380 1726773047.45582: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual [0affffe7-6841-6cfb-81ae-000000000205]
  9380 1726773047.45590: sending task result for task 0affffe7-6841-6cfb-81ae-000000000205
  9380 1726773047.45632: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000205
  9380 1726773047.45636: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce",
    "dest": "/etc/tuned/profile_mode",
    "gid": 0,
    "group": "root",
    "mode": "0600",
    "owner": "root",
    "path": "/etc/tuned/profile_mode",
    "secontext": "system_u:object_r:tuned_etc_t:s0",
    "size": 7,
    "state": "file",
    "uid": 0
}
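Same pattern as the previous task: stat the destination, then run the file module to re-assert attributes because the content already matched. The 7-byte /etc/tuned/profile_mode is consistent with the literal string 'manual' plus a newline, so a sketch under that assumption looks like:

    - name: Set profile_mode to manual
      copy:
        content: "manual\n"
        dest: /etc/tuned/profile_mode
        mode: "0600"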
  8303 1726773047.46171: no more pending results, returning what we have
  8303 1726773047.46175: results queue empty
  8303 1726773047.46176: checking for any_errors_fatal
  8303 1726773047.46187: done checking for any_errors_fatal
  8303 1726773047.46187: checking for max_fail_percentage
  8303 1726773047.46189: done checking for max_fail_percentage
  8303 1726773047.46190: checking to see if all hosts have failed and the running result is not ok
  8303 1726773047.46190: done checking to see if all hosts have failed
  8303 1726773047.46191: getting the remaining hosts for this loop
  8303 1726773047.46192: done getting the remaining hosts for this loop
  8303 1726773047.46195: getting the next task for host managed_node3
  8303 1726773047.46201: done getting next task for host managed_node3
  8303 1726773047.46204:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config
  8303 1726773047.46206:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773047.46216: getting variables
  8303 1726773047.46217: in VariableManager get_vars()
  8303 1726773047.46248: Calling all_inventory to load vars for managed_node3
  8303 1726773047.46251: Calling groups_inventory to load vars for managed_node3
  8303 1726773047.46253: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773047.46261: Calling all_plugins_play to load vars for managed_node3
  8303 1726773047.46264: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773047.46266: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773047.46318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773047.46367: done with get_vars()
  8303 1726773047.46376: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Get current config] **********
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107
Thursday 19 September 2024  15:10:47 -0400 (0:00:00.598)       0:00:24.043 **** 
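This task uses the collection's own module, fedora.linux_system_roles.kernel_settings_get_config, rather than a builtin; the module arguments logged further down show it reading /etc/tuned/kernel_settings/tuned.conf and returning the parsed sections under 'data'. A sketch of how the role might invoke it, with a hypothetical register name:

    - name: Get current config
      fedora.linux_system_roles.kernel_settings_get_config:
        path: /etc/tuned/kernel_settings/tuned.conf
      register: __kernel_settings_current_config   # register name is an assumption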
  8303 1726773047.46455: entering _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config
  8303 1726773047.46647: worker is 1 (out of 1 available)
  8303 1726773047.46659: exiting _queue_task() for managed_node3/fedora.linux_system_roles.kernel_settings_get_config
  8303 1726773047.46672: done queuing things up, now waiting for results queue to drain
  8303 1726773047.46673: waiting for pending results...
  9394 1726773047.46886: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get current config
  9394 1726773047.47017: in run() - task 0affffe7-6841-6cfb-81ae-000000000206
  9394 1726773047.47036: variable 'ansible_search_path' from source: unknown
  9394 1726773047.47040: variable 'ansible_search_path' from source: unknown
  9394 1726773047.47078: calling self._execute()
  9394 1726773047.47278: variable 'ansible_host' from source: host vars for 'managed_node3'
  9394 1726773047.47291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9394 1726773047.47300: variable 'omit' from source: magic vars
  9394 1726773047.47401: variable 'omit' from source: magic vars
  9394 1726773047.47455: variable 'omit' from source: magic vars
  9394 1726773047.47482: variable '__kernel_settings_profile_filename' from source: role '' all vars
  9394 1726773047.47768: variable '__kernel_settings_profile_filename' from source: role '' all vars
  9394 1726773047.47903: variable '__kernel_settings_profile_dir' from source: role '' all vars
  9394 1726773047.47989: variable '__kernel_settings_profile_parent' from source: set_fact
  9394 1726773047.47999: variable '__kernel_settings_tuned_profile' from source: role '' all vars
  9394 1726773047.48040: variable 'omit' from source: magic vars
  9394 1726773047.48079: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  9394 1726773047.48114: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  9394 1726773047.48135: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  9394 1726773047.48153: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9394 1726773047.48166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9394 1726773047.48197: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  9394 1726773047.48203: variable 'ansible_host' from source: host vars for 'managed_node3'
  9394 1726773047.48207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9394 1726773047.48304: Set connection var ansible_pipelining to False
  9394 1726773047.48316: Set connection var ansible_timeout to 10
  9394 1726773047.48322: Set connection var ansible_module_compression to ZIP_DEFLATED
  9394 1726773047.48327: Set connection var ansible_shell_executable to /bin/sh
  9394 1726773047.48330: Set connection var ansible_connection to ssh
  9394 1726773047.48338: Set connection var ansible_shell_type to sh
  9394 1726773047.48357: variable 'ansible_shell_executable' from source: unknown
  9394 1726773047.48362: variable 'ansible_connection' from source: unknown
  9394 1726773047.48366: variable 'ansible_module_compression' from source: unknown
  9394 1726773047.48369: variable 'ansible_shell_type' from source: unknown
  9394 1726773047.48372: variable 'ansible_shell_executable' from source: unknown
  9394 1726773047.48374: variable 'ansible_host' from source: host vars for 'managed_node3'
  9394 1726773047.48377: variable 'ansible_pipelining' from source: unknown
  9394 1726773047.48380: variable 'ansible_timeout' from source: unknown
  9394 1726773047.48383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9394 1726773047.48548: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
  9394 1726773047.48559: variable 'omit' from source: magic vars
  9394 1726773047.48565: starting attempt loop
  9394 1726773047.48568: running the handler
  9394 1726773047.48580: _low_level_execute_command(): starting
  9394 1726773047.48589: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  9394 1726773047.51270: stdout chunk (state=2):
>>>/root
<<<
  9394 1726773047.51415: stderr chunk (state=3):
>>><<<
  9394 1726773047.51424: stdout chunk (state=3):
>>><<<
  9394 1726773047.51447: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  9394 1726773047.51464: _low_level_execute_command(): starting
  9394 1726773047.51470: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773047.5145717-9394-108884448351543 `" && echo ansible-tmp-1726773047.5145717-9394-108884448351543="` echo /root/.ansible/tmp/ansible-tmp-1726773047.5145717-9394-108884448351543 `" ) && sleep 0'
  9394 1726773047.54186: stdout chunk (state=2):
>>>ansible-tmp-1726773047.5145717-9394-108884448351543=/root/.ansible/tmp/ansible-tmp-1726773047.5145717-9394-108884448351543
<<<
  9394 1726773047.54340: stderr chunk (state=3):
>>><<<
  9394 1726773047.54349: stdout chunk (state=3):
>>><<<
  9394 1726773047.54371: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773047.5145717-9394-108884448351543=/root/.ansible/tmp/ansible-tmp-1726773047.5145717-9394-108884448351543
, stderr=
  9394 1726773047.54423: variable 'ansible_module_compression' from source: unknown
  9394 1726773047.54463: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED
  9394 1726773047.54505: variable 'ansible_facts' from source: unknown
  9394 1726773047.54601: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773047.5145717-9394-108884448351543/AnsiballZ_kernel_settings_get_config.py
  9394 1726773047.56060: Sending initial data
  9394 1726773047.56071: Sent initial data (173 bytes)
  9394 1726773047.58235: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpfxa3wp_7 /root/.ansible/tmp/ansible-tmp-1726773047.5145717-9394-108884448351543/AnsiballZ_kernel_settings_get_config.py
<<<
  9394 1726773047.59716: stderr chunk (state=3):
>>><<<
  9394 1726773047.59727: stdout chunk (state=3):
>>><<<
  9394 1726773047.59751: done transferring module to remote
  9394 1726773047.59763: _low_level_execute_command(): starting
  9394 1726773047.59768: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773047.5145717-9394-108884448351543/ /root/.ansible/tmp/ansible-tmp-1726773047.5145717-9394-108884448351543/AnsiballZ_kernel_settings_get_config.py && sleep 0'
  9394 1726773047.62546: stderr chunk (state=2):
>>><<<
  9394 1726773047.62557: stdout chunk (state=2):
>>><<<
  9394 1726773047.62573: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9394 1726773047.62578: _low_level_execute_command(): starting
  9394 1726773047.62583: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773047.5145717-9394-108884448351543/AnsiballZ_kernel_settings_get_config.py && sleep 0'
  9394 1726773047.78191: stdout chunk (state=2):
>>>
{"changed": false, "data": {"main": {"summary": "kernel settings"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}}
<<<
  9394 1726773047.79294: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  9394 1726773047.79352: stderr chunk (state=3):
>>><<<
  9394 1726773047.79361: stdout chunk (state=3):
>>><<<
  9394 1726773047.79389: _low_level_execute_command() done: rc=0, stdout=
{"changed": false, "data": {"main": {"summary": "kernel settings"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}}
, stderr=Shared connection to 10.31.47.99 closed.
  9394 1726773047.79415: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773047.5145717-9394-108884448351543/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  9394 1726773047.79427: _low_level_execute_command(): starting
  9394 1726773047.79433: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773047.5145717-9394-108884448351543/ > /dev/null 2>&1 && sleep 0'
  9394 1726773047.82134: stderr chunk (state=2):
>>><<<
  9394 1726773047.82144: stdout chunk (state=2):
>>><<<
  9394 1726773047.82160: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9394 1726773047.82169: handler run complete
  9394 1726773047.82183: attempt loop complete, returning result
  9394 1726773047.82188: _execute() done
  9394 1726773047.82191: dumping result to json
  9394 1726773047.82193: done dumping result, returning
  9394 1726773047.82202: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Get current config [0affffe7-6841-6cfb-81ae-000000000206]
  9394 1726773047.82207: sending task result for task 0affffe7-6841-6cfb-81ae-000000000206
  9394 1726773047.82230: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000206
  9394 1726773047.82232: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "data": {
        "main": {
            "summary": "kernel settings"
        }
    }
}
  8303 1726773047.82626: no more pending results, returning what we have
  8303 1726773047.82629: results queue empty
  8303 1726773047.82630: checking for any_errors_fatal
  8303 1726773047.82635: done checking for any_errors_fatal
  8303 1726773047.82636: checking for max_fail_percentage
  8303 1726773047.82637: done checking for max_fail_percentage
  8303 1726773047.82637: checking to see if all hosts have failed and the running result is not ok
  8303 1726773047.82638: done checking to see if all hosts have failed
  8303 1726773047.82639: getting the remaining hosts for this loop
  8303 1726773047.82640: done getting the remaining hosts for this loop
  8303 1726773047.82642: getting the next task for host managed_node3
  8303 1726773047.82647: done getting next task for host managed_node3
  8303 1726773047.82650:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings
  8303 1726773047.82654:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773047.82662: getting variables
  8303 1726773047.82664: in VariableManager get_vars()
  8303 1726773047.82693: Calling all_inventory to load vars for managed_node3
  8303 1726773047.82696: Calling groups_inventory to load vars for managed_node3
  8303 1726773047.82698: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773047.82706: Calling all_plugins_play to load vars for managed_node3
  8303 1726773047.82709: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773047.82711: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773047.82761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773047.82812: done with get_vars()
  8303 1726773047.82820: done getting variables
  8303 1726773047.82879: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Apply kernel settings] *******
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112
Thursday 19 September 2024  15:10:47 -0400 (0:00:00.364)       0:00:24.407 **** 
  8303 1726773047.82912: entering _queue_task() for managed_node3/template
  8303 1726773047.83113: worker is 1 (out of 1 available)
  8303 1726773047.83128: exiting _queue_task() for managed_node3/template
  8303 1726773047.83141: done queuing things up, now waiting for results queue to drain
  8303 1726773047.83143: waiting for pending results...
  9419 1726773047.83347: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings
  9419 1726773047.83475: in run() - task 0affffe7-6841-6cfb-81ae-000000000207
  9419 1726773047.83492: variable 'ansible_search_path' from source: unknown
  9419 1726773047.83496: variable 'ansible_search_path' from source: unknown
  9419 1726773047.83526: calling self._execute()
  9419 1726773047.83589: variable 'ansible_host' from source: host vars for 'managed_node3'
  9419 1726773047.83600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9419 1726773047.83611: variable 'omit' from source: magic vars
  9419 1726773047.83688: variable 'omit' from source: magic vars
  9419 1726773047.83728: variable 'omit' from source: magic vars
  9419 1726773047.83968: variable '__kernel_settings_profile_src' from source: role '' all vars
  9419 1726773047.83977: variable '__kernel_settings_tuned_profile' from source: role '' all vars
  9419 1726773047.84036: variable '__kernel_settings_tuned_profile' from source: role '' all vars
  9419 1726773047.84056: variable '__kernel_settings_profile_filename' from source: role '' all vars
  9419 1726773047.84105: variable '__kernel_settings_profile_filename' from source: role '' all vars
  9419 1726773047.84154: variable '__kernel_settings_profile_dir' from source: role '' all vars
  9419 1726773047.84218: variable '__kernel_settings_profile_parent' from source: set_fact
  9419 1726773047.84226: variable '__kernel_settings_tuned_profile' from source: role '' all vars
  9419 1726773047.84248: variable 'omit' from source: magic vars
  9419 1726773047.84281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  9419 1726773047.84308: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  9419 1726773047.84324: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  9419 1726773047.84336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9419 1726773047.84346: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9419 1726773047.84369: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  9419 1726773047.84373: variable 'ansible_host' from source: host vars for 'managed_node3'
  9419 1726773047.84376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9419 1726773047.84441: Set connection var ansible_pipelining to False
  9419 1726773047.84450: Set connection var ansible_timeout to 10
  9419 1726773047.84454: Set connection var ansible_module_compression to ZIP_DEFLATED
  9419 1726773047.84458: Set connection var ansible_shell_executable to /bin/sh
  9419 1726773047.84459: Set connection var ansible_connection to ssh
  9419 1726773047.84463: Set connection var ansible_shell_type to sh
  9419 1726773047.84478: variable 'ansible_shell_executable' from source: unknown
  9419 1726773047.84481: variable 'ansible_connection' from source: unknown
  9419 1726773047.84483: variable 'ansible_module_compression' from source: unknown
  9419 1726773047.84487: variable 'ansible_shell_type' from source: unknown
  9419 1726773047.84489: variable 'ansible_shell_executable' from source: unknown
  9419 1726773047.84492: variable 'ansible_host' from source: host vars for 'managed_node3'
  9419 1726773047.84496: variable 'ansible_pipelining' from source: unknown
  9419 1726773047.84499: variable 'ansible_timeout' from source: unknown
  9419 1726773047.84503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9419 1726773047.84604: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  9419 1726773047.84615: variable 'omit' from source: magic vars
  9419 1726773047.84621: starting attempt loop
  9419 1726773047.84625: running the handler
  9419 1726773047.84634: _low_level_execute_command(): starting
  9419 1726773047.84641: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  9419 1726773047.86993: stdout chunk (state=2):
>>>/root
<<<
  9419 1726773047.87111: stderr chunk (state=3):
>>><<<
  9419 1726773047.87119: stdout chunk (state=3):
>>><<<
  9419 1726773047.87143: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  9419 1726773047.87157: _low_level_execute_command(): starting
  9419 1726773047.87163: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773047.871516-9419-213605785040230 `" && echo ansible-tmp-1726773047.871516-9419-213605785040230="` echo /root/.ansible/tmp/ansible-tmp-1726773047.871516-9419-213605785040230 `" ) && sleep 0'
  9419 1726773047.89687: stdout chunk (state=2):
>>>ansible-tmp-1726773047.871516-9419-213605785040230=/root/.ansible/tmp/ansible-tmp-1726773047.871516-9419-213605785040230
<<<
  9419 1726773047.89824: stderr chunk (state=3):
>>><<<
  9419 1726773047.89832: stdout chunk (state=3):
>>><<<
  9419 1726773047.89849: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773047.871516-9419-213605785040230=/root/.ansible/tmp/ansible-tmp-1726773047.871516-9419-213605785040230
, stderr=
  9419 1726773047.89866: evaluation_path:
	/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings
	/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks
  9419 1726773047.89888: search_path:
	/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2
	/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2
	/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2
	/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2
	/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2
	/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2
  9419 1726773047.89910: variable 'ansible_search_path' from source: unknown
  9419 1726773047.90529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
  9419 1726773047.91998: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
  9419 1726773047.92057: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
  9419 1726773047.92091: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
  9419 1726773047.92119: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
  9419 1726773047.92141: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
  9419 1726773047.92337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  9419 1726773047.92361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  9419 1726773047.92386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  9419 1726773047.92417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  9419 1726773047.92429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  9419 1726773047.92658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  9419 1726773047.92680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  9419 1726773047.92699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  9419 1726773047.92727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  9419 1726773047.92738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  9419 1726773047.92999: variable 'ansible_managed' from source: unknown
  9419 1726773047.93007: variable '__sections' from source: task vars
  9419 1726773047.93099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  9419 1726773047.93118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  9419 1726773047.93136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  9419 1726773047.93165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  9419 1726773047.93179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  9419 1726773047.93251: variable 'kernel_settings_sysctl' from source: include params
  9419 1726773047.93258: variable '__kernel_settings_state_empty' from source: role '' all vars
  9419 1726773047.93265: variable '__kernel_settings_previous_replaced' from source: role '' all vars
  9419 1726773047.93299: variable '__sysctl_old' from source: task vars
  9419 1726773047.93345: variable '__sysctl_old' from source: task vars
  9419 1726773047.93492: variable 'kernel_settings_purge' from source: include params
  9419 1726773047.93499: variable 'kernel_settings_sysctl' from source: include params
  9419 1726773047.93504: variable '__kernel_settings_state_empty' from source: role '' all vars
  9419 1726773047.93509: variable '__kernel_settings_previous_replaced' from source: role '' all vars
  9419 1726773047.93514: variable '__kernel_settings_profile_contents' from source: set_fact
  9419 1726773047.93642: variable 'kernel_settings_sysfs' from source: include params
  9419 1726773047.93649: variable '__kernel_settings_state_empty' from source: role '' all vars
  9419 1726773047.93655: variable '__kernel_settings_previous_replaced' from source: role '' all vars
  9419 1726773047.93670: variable '__sysfs_old' from source: task vars
  9419 1726773047.93712: variable '__sysfs_old' from source: task vars
  9419 1726773047.93846: variable 'kernel_settings_purge' from source: include params
  9419 1726773047.93851: variable 'kernel_settings_sysfs' from source: include params
  9419 1726773047.93854: variable '__kernel_settings_state_empty' from source: role '' all vars
  9419 1726773047.93857: variable '__kernel_settings_previous_replaced' from source: role '' all vars
  9419 1726773047.93860: variable '__kernel_settings_profile_contents' from source: set_fact
  9419 1726773047.93877: variable 'kernel_settings_systemd_cpu_affinity' from source: include params
  9419 1726773047.93884: variable '__systemd_old' from source: task vars
  9419 1726773047.93932: variable '__systemd_old' from source: task vars
  9419 1726773047.94061: variable 'kernel_settings_purge' from source: include params
  9419 1726773047.94070: variable 'kernel_settings_systemd_cpu_affinity' from source: include params
  9419 1726773047.94075: variable '__kernel_settings_state_absent' from source: role '' all vars
  9419 1726773047.94080: variable '__kernel_settings_profile_contents' from source: set_fact
  9419 1726773047.94093: variable 'kernel_settings_transparent_hugepages' from source: include params
  9419 1726773047.94098: variable 'kernel_settings_transparent_hugepages_defrag' from source: include params
  9419 1726773047.94103: variable '__trans_huge_old' from source: task vars
  9419 1726773047.94144: variable '__trans_huge_old' from source: task vars
  9419 1726773047.94275: variable 'kernel_settings_purge' from source: include params
  9419 1726773047.94281: variable 'kernel_settings_transparent_hugepages' from source: include params
  9419 1726773047.94288: variable '__kernel_settings_state_absent' from source: role '' all vars
  9419 1726773047.94293: variable '__kernel_settings_profile_contents' from source: set_fact
  9419 1726773047.94302: variable '__trans_defrag_old' from source: task vars
  9419 1726773047.94342: variable '__trans_defrag_old' from source: task vars
  9419 1726773047.94477: variable 'kernel_settings_purge' from source: include params
  9419 1726773047.94484: variable 'kernel_settings_transparent_hugepages_defrag' from source: include params
  9419 1726773047.94490: variable '__kernel_settings_state_absent' from source: role '' all vars
  9419 1726773047.94496: variable '__kernel_settings_profile_contents' from source: set_fact
  9419 1726773047.94513: variable '__kernel_settings_state_absent' from source: role '' all vars
  9419 1726773047.94524: variable '__kernel_settings_state_absent' from source: role '' all vars
  9419 1726773047.94530: variable '__kernel_settings_state_absent' from source: role '' all vars
  9419 1726773047.95178: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  9419 1726773047.95222: variable 'ansible_module_compression' from source: unknown
  9419 1726773047.95262: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
  9419 1726773047.95287: variable 'ansible_facts' from source: unknown
  9419 1726773047.95353: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773047.871516-9419-213605785040230/AnsiballZ_stat.py
  9419 1726773047.95447: Sending initial data
  9419 1726773047.95454: Sent initial data (150 bytes)
  9419 1726773047.98097: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpzqq0i9yg /root/.ansible/tmp/ansible-tmp-1726773047.871516-9419-213605785040230/AnsiballZ_stat.py
<<<
  9419 1726773047.99310: stderr chunk (state=3):
>>><<<
  9419 1726773047.99320: stdout chunk (state=3):
>>><<<
  9419 1726773047.99340: done transferring module to remote
  9419 1726773047.99351: _low_level_execute_command(): starting
  9419 1726773047.99357: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773047.871516-9419-213605785040230/ /root/.ansible/tmp/ansible-tmp-1726773047.871516-9419-213605785040230/AnsiballZ_stat.py && sleep 0'
  9419 1726773048.01989: stderr chunk (state=2):
>>><<<
  9419 1726773048.02001: stdout chunk (state=2):
>>><<<
  9419 1726773048.02018: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9419 1726773048.02025: _low_level_execute_command(): starting
  9419 1726773048.02030: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773047.871516-9419-213605785040230/AnsiballZ_stat.py && sleep 0'
  9419 1726773048.18305: stdout chunk (state=2):
>>>
{"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 86, "inode": 515899586, "dev": 51713, "nlink": 1, "atime": 1726773038.2880044, "mtime": 1726773036.897999, "ctime": 1726773037.1629999, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "mimetype": "text/plain", "charset": "us-ascii", "version": "1771849899", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}}
<<<
  9419 1726773048.19449: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  9419 1726773048.19500: stderr chunk (state=3):
>>><<<
  9419 1726773048.19510: stdout chunk (state=3):
>>><<<
  9419 1726773048.19527: _low_level_execute_command() done: rc=0, stdout=
{"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 86, "inode": 515899586, "dev": 51713, "nlink": 1, "atime": 1726773038.2880044, "mtime": 1726773036.897999, "ctime": 1726773037.1629999, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "mimetype": "text/plain", "charset": "us-ascii", "version": "1771849899", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}}
, stderr=Shared connection to 10.31.47.99 closed.
  9419 1726773048.19576: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773047.871516-9419-213605785040230/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  9419 1726773048.19607: variable 'ansible_module_compression' from source: unknown
  9419 1726773048.19641: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED
  9419 1726773048.19660: variable 'ansible_facts' from source: unknown
  9419 1726773048.19720: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773047.871516-9419-213605785040230/AnsiballZ_file.py
  9419 1726773048.19822: Sending initial data
  9419 1726773048.19829: Sent initial data (150 bytes)
  9419 1726773048.22611: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmp56amfbjb /root/.ansible/tmp/ansible-tmp-1726773047.871516-9419-213605785040230/AnsiballZ_file.py
<<<
  9419 1726773048.23821: stderr chunk (state=3):
>>><<<
  9419 1726773048.23833: stdout chunk (state=3):
>>><<<
  9419 1726773048.23854: done transferring module to remote
  9419 1726773048.23863: _low_level_execute_command(): starting
  9419 1726773048.23869: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773047.871516-9419-213605785040230/ /root/.ansible/tmp/ansible-tmp-1726773047.871516-9419-213605785040230/AnsiballZ_file.py && sleep 0'
  9419 1726773048.26286: stderr chunk (state=2):
>>><<<
  9419 1726773048.26299: stdout chunk (state=2):
>>><<<
  9419 1726773048.26315: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9419 1726773048.26319: _low_level_execute_command(): starting
  9419 1726773048.26325: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773047.871516-9419-213605785040230/AnsiballZ_file.py && sleep 0'
  9419 1726773048.42418: stdout chunk (state=2):
>>>
{"path": "/etc/tuned/kernel_settings/tuned.conf", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings/tuned.conf"}, "after": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "invocation": {"module_args": {"mode": "0644", "dest": "/etc/tuned/kernel_settings/tuned.conf", "_original_basename": "kernel_settings.j2", "recurse": false, "state": "file", "path": "/etc/tuned/kernel_settings/tuned.conf", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}}
<<<
  9419 1726773048.43543: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  9419 1726773048.43593: stderr chunk (state=3):
>>><<<
  9419 1726773048.43600: stdout chunk (state=3):
>>><<<
  9419 1726773048.43617: _low_level_execute_command() done: rc=0, stdout=
{"path": "/etc/tuned/kernel_settings/tuned.conf", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings/tuned.conf"}, "after": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "invocation": {"module_args": {"mode": "0644", "dest": "/etc/tuned/kernel_settings/tuned.conf", "_original_basename": "kernel_settings.j2", "recurse": false, "state": "file", "path": "/etc/tuned/kernel_settings/tuned.conf", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}}
, stderr=Shared connection to 10.31.47.99 closed.
  9419 1726773048.43645: done with _execute_module (ansible.legacy.file, {'mode': '0644', 'dest': '/etc/tuned/kernel_settings/tuned.conf', '_original_basename': 'kernel_settings.j2', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773047.871516-9419-213605785040230/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  9419 1726773048.43675: _low_level_execute_command(): starting
  9419 1726773048.43682: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773047.871516-9419-213605785040230/ > /dev/null 2>&1 && sleep 0'
  9419 1726773048.46206: stderr chunk (state=2):
>>><<<
  9419 1726773048.46216: stdout chunk (state=2):
>>><<<
  9419 1726773048.46230: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9419 1726773048.46239: handler run complete
  9419 1726773048.46259: attempt loop complete, returning result
  9419 1726773048.46264: _execute() done
  9419 1726773048.46272: dumping result to json
  9419 1726773048.46278: done dumping result, returning
  9419 1726773048.46287: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings [0affffe7-6841-6cfb-81ae-000000000207]
  9419 1726773048.46295: sending task result for task 0affffe7-6841-6cfb-81ae-000000000207
  9419 1726773048.46337: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000207
  9419 1726773048.46342: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069",
    "dest": "/etc/tuned/kernel_settings/tuned.conf",
    "gid": 0,
    "group": "root",
    "mode": "0644",
    "owner": "root",
    "path": "/etc/tuned/kernel_settings/tuned.conf",
    "secontext": "system_u:object_r:tuned_etc_t:s0",
    "size": 86,
    "state": "file",
    "uid": 0
}
  8303 1726773048.46573: no more pending results, returning what we have
  8303 1726773048.46576: results queue empty
  8303 1726773048.46577: checking for any_errors_fatal
  8303 1726773048.46582: done checking for any_errors_fatal
  8303 1726773048.46583: checking for max_fail_percentage
  8303 1726773048.46584: done checking for max_fail_percentage
  8303 1726773048.46586: checking to see if all hosts have failed and the running result is not ok
  8303 1726773048.46587: done checking to see if all hosts have failed
  8303 1726773048.46587: getting the remaining hosts for this loop
  8303 1726773048.46588: done getting the remaining hosts for this loop
  8303 1726773048.46591: getting the next task for host managed_node3
  8303 1726773048.46596: done getting next task for host managed_node3
  8303 1726773048.46598:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes
  8303 1726773048.46600:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773048.46607: getting variables
  8303 1726773048.46608: in VariableManager get_vars()
  8303 1726773048.46632: Calling all_inventory to load vars for managed_node3
  8303 1726773048.46634: Calling groups_inventory to load vars for managed_node3
  8303 1726773048.46635: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773048.46641: Calling all_plugins_play to load vars for managed_node3
  8303 1726773048.46643: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773048.46645: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773048.46679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773048.46715: done with get_vars()
  8303 1726773048.46721: done getting variables
  8303 1726773048.46760: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149
Thursday 19 September 2024  15:10:48 -0400 (0:00:00.638)       0:00:25.046 **** 
  8303 1726773048.46782: entering _queue_task() for managed_node3/service
  8303 1726773048.46947: worker is 1 (out of 1 available)
  8303 1726773048.46961: exiting _queue_task() for managed_node3/service
  8303 1726773048.46973: done queuing things up, now waiting for results queue to drain
  8303 1726773048.46974: waiting for pending results...
  9444 1726773048.47095: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes
  9444 1726773048.47207: in run() - task 0affffe7-6841-6cfb-81ae-000000000208
  9444 1726773048.47223: variable 'ansible_search_path' from source: unknown
  9444 1726773048.47228: variable 'ansible_search_path' from source: unknown
  9444 1726773048.47264: variable '__kernel_settings_services' from source: include_vars
  9444 1726773048.47513: variable '__kernel_settings_services' from source: include_vars
  9444 1726773048.47576: variable 'omit' from source: magic vars
  9444 1726773048.47675: variable 'ansible_host' from source: host vars for 'managed_node3'
  9444 1726773048.47688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9444 1726773048.47697: variable 'omit' from source: magic vars
  9444 1726773048.47942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
  9444 1726773048.48177: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
  9444 1726773048.48210: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
  9444 1726773048.48238: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
  9444 1726773048.48277: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
  9444 1726773048.48379: variable '__kernel_settings_register_profile' from source: set_fact
  9444 1726773048.48395: variable '__kernel_settings_register_mode' from source: set_fact
  9444 1726773048.48414: Evaluated conditional (__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed): False
  9444 1726773048.48418: when evaluation is False, skipping this task
  9444 1726773048.48446: variable 'item' from source: unknown
  9444 1726773048.48516: variable 'item' from source: unknown
skipping: [managed_node3] => (item=tuned)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed",
    "item": "tuned",
    "skip_reason": "Conditional result was False"
}
  9444 1726773048.48546: dumping result to json
  9444 1726773048.48551: done dumping result, returning
  9444 1726773048.48556: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes [0affffe7-6841-6cfb-81ae-000000000208]
  9444 1726773048.48562: sending task result for task 0affffe7-6841-6cfb-81ae-000000000208
  9444 1726773048.48589: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000208
  9444 1726773048.48592: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false
}

MSG:

All items skipped
  8303 1726773048.49001: no more pending results, returning what we have
  8303 1726773048.49003: results queue empty
  8303 1726773048.49004: checking for any_errors_fatal
  8303 1726773048.49011: done checking for any_errors_fatal
  8303 1726773048.49012: checking for max_fail_percentage
  8303 1726773048.49013: done checking for max_fail_percentage
  8303 1726773048.49013: checking to see if all hosts have failed and the running result is not ok
  8303 1726773048.49013: done checking to see if all hosts have failed
  8303 1726773048.49014: getting the remaining hosts for this loop
  8303 1726773048.49015: done getting the remaining hosts for this loop
  8303 1726773048.49017: getting the next task for host managed_node3
  8303 1726773048.49021: done getting next task for host managed_node3
  8303 1726773048.49023:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings
  8303 1726773048.49026:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773048.49035: getting variables
  8303 1726773048.49036: in VariableManager get_vars()
  8303 1726773048.49060: Calling all_inventory to load vars for managed_node3
  8303 1726773048.49062: Calling groups_inventory to load vars for managed_node3
  8303 1726773048.49063: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773048.49069: Calling all_plugins_play to load vars for managed_node3
  8303 1726773048.49071: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773048.49073: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773048.49110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773048.49141: done with get_vars()
  8303 1726773048.49148: done getting variables
  8303 1726773048.49188: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Tuned apply settings] ********
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157
Thursday 19 September 2024  15:10:48 -0400 (0:00:00.024)       0:00:25.070 **** 
  8303 1726773048.49210: entering _queue_task() for managed_node3/command
  8303 1726773048.49372: worker is 1 (out of 1 available)
  8303 1726773048.49388: exiting _queue_task() for managed_node3/command
  8303 1726773048.49401: done queuing things up, now waiting for results queue to drain
  8303 1726773048.49402: waiting for pending results...
  9445 1726773048.49530: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings
  9445 1726773048.49640: in run() - task 0affffe7-6841-6cfb-81ae-000000000209
  9445 1726773048.49663: variable 'ansible_search_path' from source: unknown
  9445 1726773048.49670: variable 'ansible_search_path' from source: unknown
  9445 1726773048.49705: calling self._execute()
  9445 1726773048.49771: variable 'ansible_host' from source: host vars for 'managed_node3'
  9445 1726773048.49780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9445 1726773048.49791: variable 'omit' from source: magic vars
  9445 1726773048.50153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
  9445 1726773048.50400: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
  9445 1726773048.50442: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
  9445 1726773048.50476: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
  9445 1726773048.50510: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
  9445 1726773048.50625: variable '__kernel_settings_register_profile' from source: set_fact
  9445 1726773048.50654: Evaluated conditional (not __kernel_settings_register_profile is changed): True
  9445 1726773048.50773: variable '__kernel_settings_register_mode' from source: set_fact
  9445 1726773048.50784: Evaluated conditional (not __kernel_settings_register_mode is changed): True
  9445 1726773048.50882: variable '__kernel_settings_register_apply' from source: set_fact
  9445 1726773048.50895: Evaluated conditional (__kernel_settings_register_apply is changed): False
  9445 1726773048.50900: when evaluation is False, skipping this task
  9445 1726773048.50903: _execute() done
  9445 1726773048.50907: dumping result to json
  9445 1726773048.50910: done dumping result, returning
  9445 1726773048.50916: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings [0affffe7-6841-6cfb-81ae-000000000209]
  9445 1726773048.50922: sending task result for task 0affffe7-6841-6cfb-81ae-000000000209
  9445 1726773048.50945: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000209
  9445 1726773048.50947: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__kernel_settings_register_apply is changed",
    "skip_reason": "Conditional result was False"
}
  8303 1726773048.51092: no more pending results, returning what we have
  8303 1726773048.51095: results queue empty
  8303 1726773048.51096: checking for any_errors_fatal
  8303 1726773048.51102: done checking for any_errors_fatal
  8303 1726773048.51102: checking for max_fail_percentage
  8303 1726773048.51104: done checking for max_fail_percentage
  8303 1726773048.51104: checking to see if all hosts have failed and the running result is not ok
  8303 1726773048.51105: done checking to see if all hosts have failed
  8303 1726773048.51105: getting the remaining hosts for this loop
  8303 1726773048.51106: done getting the remaining hosts for this loop
  8303 1726773048.51109: getting the next task for host managed_node3
  8303 1726773048.51114: done getting next task for host managed_node3
  8303 1726773048.51117:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings
  8303 1726773048.51120:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773048.51132: getting variables
  8303 1726773048.51133: in VariableManager get_vars()
  8303 1726773048.51156: Calling all_inventory to load vars for managed_node3
  8303 1726773048.51158: Calling groups_inventory to load vars for managed_node3
  8303 1726773048.51159: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773048.51165: Calling all_plugins_play to load vars for managed_node3
  8303 1726773048.51167: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773048.51170: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773048.51206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773048.51239: done with get_vars()
  8303 1726773048.51244: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Verify settings] *************
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:166
Thursday 19 September 2024  15:10:48 -0400 (0:00:00.020)       0:00:25.091 **** 
  8303 1726773048.51309: entering _queue_task() for managed_node3/include_tasks
  8303 1726773048.51464: worker is 1 (out of 1 available)
  8303 1726773048.51478: exiting _queue_task() for managed_node3/include_tasks
  8303 1726773048.51490: done queuing things up, now waiting for results queue to drain
  8303 1726773048.51492: waiting for pending results...
  9447 1726773048.51604: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Verify settings
  9447 1726773048.51711: in run() - task 0affffe7-6841-6cfb-81ae-00000000020a
  9447 1726773048.51726: variable 'ansible_search_path' from source: unknown
  9447 1726773048.51730: variable 'ansible_search_path' from source: unknown
  9447 1726773048.51756: calling self._execute()
  9447 1726773048.51809: variable 'ansible_host' from source: host vars for 'managed_node3'
  9447 1726773048.51817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9447 1726773048.51823: variable 'omit' from source: magic vars
  9447 1726773048.52137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
  9447 1726773048.52316: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
  9447 1726773048.52349: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
  9447 1726773048.52379: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
  9447 1726773048.52469: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
  9447 1726773048.52550: variable '__kernel_settings_register_apply' from source: set_fact
  9447 1726773048.52574: Evaluated conditional (__kernel_settings_register_apply is changed): False
  9447 1726773048.52579: when evaluation is False, skipping this task
  9447 1726773048.52582: _execute() done
  9447 1726773048.52587: dumping result to json
  9447 1726773048.52591: done dumping result, returning
  9447 1726773048.52595: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Verify settings [0affffe7-6841-6cfb-81ae-00000000020a]
  9447 1726773048.52600: sending task result for task 0affffe7-6841-6cfb-81ae-00000000020a
  9447 1726773048.52618: done sending task result for task 0affffe7-6841-6cfb-81ae-00000000020a
  9447 1726773048.52620: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__kernel_settings_register_apply is changed",
    "skip_reason": "Conditional result was False"
}
  8303 1726773048.52930: no more pending results, returning what we have
  8303 1726773048.52932: results queue empty
  8303 1726773048.52933: checking for any_errors_fatal
  8303 1726773048.52936: done checking for any_errors_fatal
  8303 1726773048.52937: checking for max_fail_percentage
  8303 1726773048.52938: done checking for max_fail_percentage
  8303 1726773048.52938: checking to see if all hosts have failed and the running result is not ok
  8303 1726773048.52938: done checking to see if all hosts have failed
  8303 1726773048.52939: getting the remaining hosts for this loop
  8303 1726773048.52939: done getting the remaining hosts for this loop
  8303 1726773048.52942: getting the next task for host managed_node3
  8303 1726773048.52946: done getting next task for host managed_node3
  8303 1726773048.52948:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes
  8303 1726773048.52951:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773048.52960: getting variables
  8303 1726773048.52961: in VariableManager get_vars()
  8303 1726773048.52984: Calling all_inventory to load vars for managed_node3
  8303 1726773048.52987: Calling groups_inventory to load vars for managed_node3
  8303 1726773048.52989: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773048.52996: Calling all_plugins_play to load vars for managed_node3
  8303 1726773048.52997: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773048.52999: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773048.53035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773048.53067: done with get_vars()
  8303 1726773048.53073: done getting variables
  8303 1726773048.53124: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:177
Thursday 19 September 2024  15:10:48 -0400 (0:00:00.018)       0:00:25.110 **** 
  8303 1726773048.53158: entering _queue_task() for managed_node3/set_fact
  8303 1726773048.53348: worker is 1 (out of 1 available)
  8303 1726773048.53362: exiting _queue_task() for managed_node3/set_fact
  8303 1726773048.53375: done queuing things up, now waiting for results queue to drain
  8303 1726773048.53376: waiting for pending results...
  9449 1726773048.53600: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes
  9449 1726773048.53724: in run() - task 0affffe7-6841-6cfb-81ae-00000000020b
  9449 1726773048.53743: variable 'ansible_search_path' from source: unknown
  9449 1726773048.53747: variable 'ansible_search_path' from source: unknown
  9449 1726773048.53773: calling self._execute()
  9449 1726773048.53828: variable 'ansible_host' from source: host vars for 'managed_node3'
  9449 1726773048.53835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9449 1726773048.53841: variable 'omit' from source: magic vars
  9449 1726773048.53911: variable 'omit' from source: magic vars
  9449 1726773048.53953: variable 'omit' from source: magic vars
  9449 1726773048.53975: variable 'omit' from source: magic vars
  9449 1726773048.54007: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  9449 1726773048.54032: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  9449 1726773048.54048: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  9449 1726773048.54060: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9449 1726773048.54069: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9449 1726773048.54092: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  9449 1726773048.54096: variable 'ansible_host' from source: host vars for 'managed_node3'
  9449 1726773048.54098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9449 1726773048.54163: Set connection var ansible_pipelining to False
  9449 1726773048.54171: Set connection var ansible_timeout to 10
  9449 1726773048.54176: Set connection var ansible_module_compression to ZIP_DEFLATED
  9449 1726773048.54179: Set connection var ansible_shell_executable to /bin/sh
  9449 1726773048.54181: Set connection var ansible_connection to ssh
  9449 1726773048.54186: Set connection var ansible_shell_type to sh
  9449 1726773048.54200: variable 'ansible_shell_executable' from source: unknown
  9449 1726773048.54202: variable 'ansible_connection' from source: unknown
  9449 1726773048.54205: variable 'ansible_module_compression' from source: unknown
  9449 1726773048.54207: variable 'ansible_shell_type' from source: unknown
  9449 1726773048.54208: variable 'ansible_shell_executable' from source: unknown
  9449 1726773048.54210: variable 'ansible_host' from source: host vars for 'managed_node3'
  9449 1726773048.54212: variable 'ansible_pipelining' from source: unknown
  9449 1726773048.54213: variable 'ansible_timeout' from source: unknown
  9449 1726773048.54215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9449 1726773048.54317: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  9449 1726773048.54330: variable 'omit' from source: magic vars
  9449 1726773048.54336: starting attempt loop
  9449 1726773048.54340: running the handler
  9449 1726773048.54349: handler run complete
  9449 1726773048.54359: attempt loop complete, returning result
  9449 1726773048.54363: _execute() done
  9449 1726773048.54366: dumping result to json
  9449 1726773048.54369: done dumping result, returning
  9449 1726773048.54374: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes [0affffe7-6841-6cfb-81ae-00000000020b]
  9449 1726773048.54379: sending task result for task 0affffe7-6841-6cfb-81ae-00000000020b
  9449 1726773048.54397: done sending task result for task 0affffe7-6841-6cfb-81ae-00000000020b
  9449 1726773048.54399: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "kernel_settings_reboot_required": false
    },
    "changed": false
}
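
The `ok` result above carries an `ansible_facts` payload rather than any remote change: `kernel_settings_reboot_required` is recorded as false for this host. A rough sketch (not Ansible's actual fact-cache code; the prior value below is a hypothetical placeholder) of how such a payload updates the host's facts:

```python
# Rough sketch of how a set_fact result like the one above feeds host facts:
# the "ansible_facts" dict in the task result is merged into the per-host
# fact store, leaving other facts untouched.
task_result = {
    "ansible_facts": {"kernel_settings_reboot_required": False},
    "changed": False,
}

host_facts = {"kernel_settings_reboot_required": True}  # hypothetical prior value
host_facts.update(task_result["ansible_facts"])
print(host_facts["kernel_settings_reboot_required"])  # -> False
```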
  8303 1726773048.54621: no more pending results, returning what we have
  8303 1726773048.54623: results queue empty
  8303 1726773048.54623: checking for any_errors_fatal
  8303 1726773048.54627: done checking for any_errors_fatal
  8303 1726773048.54627: checking for max_fail_percentage
  8303 1726773048.54628: done checking for max_fail_percentage
  8303 1726773048.54628: checking to see if all hosts have failed and the running result is not ok
  8303 1726773048.54629: done checking to see if all hosts have failed
  8303 1726773048.54629: getting the remaining hosts for this loop
  8303 1726773048.54630: done getting the remaining hosts for this loop
  8303 1726773048.54632: getting the next task for host managed_node3
  8303 1726773048.54636: done getting next task for host managed_node3
  8303 1726773048.54638:  ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing
  8303 1726773048.54641:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773048.54646: getting variables
  8303 1726773048.54647: in VariableManager get_vars()
  8303 1726773048.54671: Calling all_inventory to load vars for managed_node3
  8303 1726773048.54673: Calling groups_inventory to load vars for managed_node3
  8303 1726773048.54674: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773048.54680: Calling all_plugins_play to load vars for managed_node3
  8303 1726773048.54682: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773048.54683: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773048.54719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773048.54749: done with get_vars()
  8303 1726773048.54755: done getting variables
  8303 1726773048.54795: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing] ***
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:181
Thursday 19 September 2024  15:10:48 -0400 (0:00:00.016)       0:00:25.126 **** 
  8303 1726773048.54818: entering _queue_task() for managed_node3/set_fact
  8303 1726773048.54976: worker is 1 (out of 1 available)
  8303 1726773048.54991: exiting _queue_task() for managed_node3/set_fact
  8303 1726773048.55003: done queuing things up, now waiting for results queue to drain
  8303 1726773048.55004: waiting for pending results...
  9450 1726773048.55113: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing
  9450 1726773048.55214: in run() - task 0affffe7-6841-6cfb-81ae-00000000020c
  9450 1726773048.55230: variable 'ansible_search_path' from source: unknown
  9450 1726773048.55234: variable 'ansible_search_path' from source: unknown
  9450 1726773048.55261: calling self._execute()
  9450 1726773048.55316: variable 'ansible_host' from source: host vars for 'managed_node3'
  9450 1726773048.55325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9450 1726773048.55334: variable 'omit' from source: magic vars
  9450 1726773048.55406: variable 'omit' from source: magic vars
  9450 1726773048.55443: variable 'omit' from source: magic vars
  9450 1726773048.55708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
  9450 1726773048.55956: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
  9450 1726773048.55993: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
  9450 1726773048.56019: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
  9450 1726773048.56046: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
  9450 1726773048.56145: variable '__kernel_settings_register_profile' from source: set_fact
  9450 1726773048.56159: variable '__kernel_settings_register_mode' from source: set_fact
  9450 1726773048.56167: variable '__kernel_settings_register_apply' from source: set_fact
  9450 1726773048.56207: variable 'omit' from source: magic vars
  9450 1726773048.56230: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  9450 1726773048.56251: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  9450 1726773048.56268: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  9450 1726773048.56283: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9450 1726773048.56295: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9450 1726773048.56318: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  9450 1726773048.56322: variable 'ansible_host' from source: host vars for 'managed_node3'
  9450 1726773048.56326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9450 1726773048.56395: Set connection var ansible_pipelining to False
  9450 1726773048.56403: Set connection var ansible_timeout to 10
  9450 1726773048.56406: Set connection var ansible_module_compression to ZIP_DEFLATED
  9450 1726773048.56410: Set connection var ansible_shell_executable to /bin/sh
  9450 1726773048.56412: Set connection var ansible_connection to ssh
  9450 1726773048.56416: Set connection var ansible_shell_type to sh
  9450 1726773048.56430: variable 'ansible_shell_executable' from source: unknown
  9450 1726773048.56433: variable 'ansible_connection' from source: unknown
  9450 1726773048.56434: variable 'ansible_module_compression' from source: unknown
  9450 1726773048.56436: variable 'ansible_shell_type' from source: unknown
  9450 1726773048.56439: variable 'ansible_shell_executable' from source: unknown
  9450 1726773048.56441: variable 'ansible_host' from source: host vars for 'managed_node3'
  9450 1726773048.56443: variable 'ansible_pipelining' from source: unknown
  9450 1726773048.56444: variable 'ansible_timeout' from source: unknown
  9450 1726773048.56446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9450 1726773048.56512: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  9450 1726773048.56520: variable 'omit' from source: magic vars
  9450 1726773048.56524: starting attempt loop
  9450 1726773048.56526: running the handler
  9450 1726773048.56533: handler run complete
  9450 1726773048.56539: attempt loop complete, returning result
  9450 1726773048.56541: _execute() done
  9450 1726773048.56542: dumping result to json
  9450 1726773048.56545: done dumping result, returning
  9450 1726773048.56550: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing [0affffe7-6841-6cfb-81ae-00000000020c]
  9450 1726773048.56554: sending task result for task 0affffe7-6841-6cfb-81ae-00000000020c
  9450 1726773048.56571: done sending task result for task 0affffe7-6841-6cfb-81ae-00000000020c
  9450 1726773048.56573: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "__kernel_settings_changed": false
    },
    "changed": false
}
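
Just before this set_fact, the worker read the three registered results `__kernel_settings_register_profile`, `__kernel_settings_register_mode`, and `__kernel_settings_register_apply`. The role's exact expression is not shown in this log, but a plausible reconstruction (assumption, for illustration only) is that the testing flag is true if any of those registers reported a change:

```python
# Assumed reconstruction (the role's actual expression is not visible here):
# __kernel_settings_changed is likely true if any of the three registered
# results reported a change. All three were unchanged in this run.
register_profile = {"changed": False}
register_mode = {"changed": False}
register_apply = {"changed": False}

kernel_settings_changed = any(
    r.get("changed", False)
    for r in (register_profile, register_mode, register_apply)
)
print(kernel_settings_changed)  # -> False, matching the ansible_facts above
```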
  8303 1726773048.56819: no more pending results, returning what we have
  8303 1726773048.56821: results queue empty
  8303 1726773048.56821: checking for any_errors_fatal
  8303 1726773048.56824: done checking for any_errors_fatal
  8303 1726773048.56825: checking for max_fail_percentage
  8303 1726773048.56826: done checking for max_fail_percentage
  8303 1726773048.56826: checking to see if all hosts have failed and the running result is not ok
  8303 1726773048.56826: done checking to see if all hosts have failed
  8303 1726773048.56827: getting the remaining hosts for this loop
  8303 1726773048.56828: done getting the remaining hosts for this loop
  8303 1726773048.56830: getting the next task for host managed_node3
  8303 1726773048.56836: done getting next task for host managed_node3
  8303 1726773048.56838:  ^ task is: TASK: meta (role_complete)
  8303 1726773048.56840:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773048.56846: getting variables
  8303 1726773048.56847: in VariableManager get_vars()
  8303 1726773048.56871: Calling all_inventory to load vars for managed_node3
  8303 1726773048.56873: Calling groups_inventory to load vars for managed_node3
  8303 1726773048.56875: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773048.56881: Calling all_plugins_play to load vars for managed_node3
  8303 1726773048.56882: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773048.56884: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773048.56920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773048.56948: done with get_vars()
  8303 1726773048.56954: done getting variables
  8303 1726773048.57012: done queuing things up, now waiting for results queue to drain
  8303 1726773048.57017: results queue empty
  8303 1726773048.57017: checking for any_errors_fatal
  8303 1726773048.57020: done checking for any_errors_fatal
  8303 1726773048.57021: checking for max_fail_percentage
  8303 1726773048.57021: done checking for max_fail_percentage
  8303 1726773048.57022: checking to see if all hosts have failed and the running result is not ok
  8303 1726773048.57022: done checking to see if all hosts have failed
  8303 1726773048.57022: getting the remaining hosts for this loop
  8303 1726773048.57023: done getting the remaining hosts for this loop
  8303 1726773048.57024: getting the next task for host managed_node3
  8303 1726773048.57027: done getting next task for host managed_node3
  8303 1726773048.57028:  ^ task is: TASK: Verify no settings
  8303 1726773048.57029:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773048.57031: getting variables
  8303 1726773048.57031: in VariableManager get_vars()
  8303 1726773048.57039: Calling all_inventory to load vars for managed_node3
  8303 1726773048.57040: Calling groups_inventory to load vars for managed_node3
  8303 1726773048.57041: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773048.57044: Calling all_plugins_play to load vars for managed_node3
  8303 1726773048.57046: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773048.57047: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773048.57067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773048.57088: done with get_vars()
  8303 1726773048.57092: done getting variables
  8303 1726773048.57118: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Verify no settings] ******************************************************
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:20
Thursday 19 September 2024  15:10:48 -0400 (0:00:00.023)       0:00:25.149 **** 
  8303 1726773048.57136: entering _queue_task() for managed_node3/shell
  8303 1726773048.57294: worker is 1 (out of 1 available)
  8303 1726773048.57308: exiting _queue_task() for managed_node3/shell
  8303 1726773048.57321: done queuing things up, now waiting for results queue to drain
  8303 1726773048.57322: waiting for pending results...
  9451 1726773048.57434: running TaskExecutor() for managed_node3/TASK: Verify no settings
  9451 1726773048.57530: in run() - task 0affffe7-6841-6cfb-81ae-000000000154
  9451 1726773048.57545: variable 'ansible_search_path' from source: unknown
  9451 1726773048.57549: variable 'ansible_search_path' from source: unknown
  9451 1726773048.57578: calling self._execute()
  9451 1726773048.57727: variable 'ansible_host' from source: host vars for 'managed_node3'
  9451 1726773048.57735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9451 1726773048.57744: variable 'omit' from source: magic vars
  9451 1726773048.57816: variable 'omit' from source: magic vars
  9451 1726773048.57845: variable 'omit' from source: magic vars
  9451 1726773048.58086: variable '__kernel_settings_profile_filename' from source: role '' exported vars
  9451 1726773048.58141: variable '__kernel_settings_profile_dir' from source: role '' exported vars
  9451 1726773048.58204: variable '__kernel_settings_profile_parent' from source: set_fact
  9451 1726773048.58212: variable '__kernel_settings_tuned_profile' from source: role '' exported vars
  9451 1726773048.58246: variable 'omit' from source: magic vars
  9451 1726773048.58280: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  9451 1726773048.58308: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  9451 1726773048.58325: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  9451 1726773048.58339: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9451 1726773048.58351: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9451 1726773048.58376: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  9451 1726773048.58382: variable 'ansible_host' from source: host vars for 'managed_node3'
  9451 1726773048.58387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9451 1726773048.58454: Set connection var ansible_pipelining to False
  9451 1726773048.58464: Set connection var ansible_timeout to 10
  9451 1726773048.58472: Set connection var ansible_module_compression to ZIP_DEFLATED
  9451 1726773048.58477: Set connection var ansible_shell_executable to /bin/sh
  9451 1726773048.58480: Set connection var ansible_connection to ssh
  9451 1726773048.58484: Set connection var ansible_shell_type to sh
  9451 1726773048.58502: variable 'ansible_shell_executable' from source: unknown
  9451 1726773048.58506: variable 'ansible_connection' from source: unknown
  9451 1726773048.58510: variable 'ansible_module_compression' from source: unknown
  9451 1726773048.58513: variable 'ansible_shell_type' from source: unknown
  9451 1726773048.58516: variable 'ansible_shell_executable' from source: unknown
  9451 1726773048.58521: variable 'ansible_host' from source: host vars for 'managed_node3'
  9451 1726773048.58525: variable 'ansible_pipelining' from source: unknown
  9451 1726773048.58528: variable 'ansible_timeout' from source: unknown
  9451 1726773048.58532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9451 1726773048.58624: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  9451 1726773048.58636: variable 'omit' from source: magic vars
  9451 1726773048.58642: starting attempt loop
  9451 1726773048.58645: running the handler
  9451 1726773048.58653: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  9451 1726773048.58668: _low_level_execute_command(): starting
  9451 1726773048.58677: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  9451 1726773048.61116: stdout chunk (state=2):
>>>/root
<<<
  9451 1726773048.61238: stderr chunk (state=3):
>>><<<
  9451 1726773048.61246: stdout chunk (state=3):
>>><<<
  9451 1726773048.61265: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  9451 1726773048.61282: _low_level_execute_command(): starting
  9451 1726773048.61290: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773048.6127608-9451-210148793500449 `" && echo ansible-tmp-1726773048.6127608-9451-210148793500449="` echo /root/.ansible/tmp/ansible-tmp-1726773048.6127608-9451-210148793500449 `" ) && sleep 0'
  9451 1726773048.63895: stdout chunk (state=2):
>>>ansible-tmp-1726773048.6127608-9451-210148793500449=/root/.ansible/tmp/ansible-tmp-1726773048.6127608-9451-210148793500449
<<<
  9451 1726773048.64061: stderr chunk (state=3):
>>><<<
  9451 1726773048.64072: stdout chunk (state=3):
>>><<<
  9451 1726773048.64093: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773048.6127608-9451-210148793500449=/root/.ansible/tmp/ansible-tmp-1726773048.6127608-9451-210148793500449
, stderr=
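
The mkdir command just executed creates the per-task working directory under the remote user's `~/.ansible/tmp` with `umask 77`, so the directory is owner-only. An approximate Python equivalent (illustration only; Ansible actually runs the `/bin/sh` command shown above on the target, and the suffix below is hypothetical):

```python
# Approximate equivalent of the remote tmp-dir creation shown above.
# umask 77 means the new directory ends up with mode 0700 (owner-only).
import os
import time

base = os.path.expanduser("~/.ansible/tmp")   # remote_tmp default
name = f"ansible-tmp-{time.time()}-example"    # hypothetical suffix
path = os.path.join(base, name)

old_umask = os.umask(0o077)
try:
    os.makedirs(path)                          # mkdir -p + mkdir equivalent
finally:
    os.umask(old_umask)

print(path)  # the controller records this path for the module upload
```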
  9451 1726773048.64123: variable 'ansible_module_compression' from source: unknown
  9451 1726773048.64176: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED
  9451 1726773048.64215: variable 'ansible_facts' from source: unknown
  9451 1726773048.64312: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773048.6127608-9451-210148793500449/AnsiballZ_command.py
  9451 1726773048.64430: Sending initial data
  9451 1726773048.64437: Sent initial data (154 bytes)
  9451 1726773048.67340: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmp8w_o3zsf /root/.ansible/tmp/ansible-tmp-1726773048.6127608-9451-210148793500449/AnsiballZ_command.py
<<<
  9451 1726773048.68535: stderr chunk (state=3):
>>><<<
  9451 1726773048.68545: stdout chunk (state=3):
>>><<<
  9451 1726773048.68565: done transferring module to remote
  9451 1726773048.68578: _low_level_execute_command(): starting
  9451 1726773048.68583: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773048.6127608-9451-210148793500449/ /root/.ansible/tmp/ansible-tmp-1726773048.6127608-9451-210148793500449/AnsiballZ_command.py && sleep 0'
  9451 1726773048.71069: stderr chunk (state=2):
>>><<<
  9451 1726773048.71078: stdout chunk (state=2):
>>><<<
  9451 1726773048.71094: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9451 1726773048.71098: _low_level_execute_command(): starting
  9451 1726773048.71104: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773048.6127608-9451-210148793500449/AnsiballZ_command.py && sleep 0'
  9451 1726773048.87095: stdout chunk (state=2):
>>>
{"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ conf=/etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysctl\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysfs\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[systemd\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[vm\\]' /etc/tuned/kernel_settings/tuned.conf\n+ exit 0", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n  if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n    echo ERROR: \"$section\" settings present\n    rc=1\n  fi\ndone\nexit \"$rc\"\n", "start": "2024-09-19 15:10:48.860626", "end": "2024-09-19 15:10:48.867998", "delta": "0:00:00.007372", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n  if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n    echo ERROR: \"$section\" settings present\n    rc=1\n  fi\ndone\nexit \"$rc\"\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}
<<<
  9451 1726773048.88360: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  9451 1726773048.88374: stdout chunk (state=3):
>>><<<
  9451 1726773048.88388: stderr chunk (state=3):
>>><<<
  9451 1726773048.88406: _low_level_execute_command() done: rc=0, stdout=
{"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ conf=/etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysctl\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysfs\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[systemd\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[vm\\]' /etc/tuned/kernel_settings/tuned.conf\n+ exit 0", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n  if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n    echo ERROR: \"$section\" settings present\n    rc=1\n  fi\ndone\nexit \"$rc\"\n", "start": "2024-09-19 15:10:48.860626", "end": "2024-09-19 15:10:48.867998", "delta": "0:00:00.007372", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n  if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n    echo ERROR: \"$section\" settings present\n    rc=1\n  fi\ndone\nexit \"$rc\"\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}
, stderr=Shared connection to 10.31.47.99 closed.
  9451 1726773048.88449: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n  if grep ^\\\\["$section"\\\\] "$conf"; then\n    echo ERROR: "$section" settings present\n    rc=1\n  fi\ndone\nexit "$rc"\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773048.6127608-9451-210148793500449/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  9451 1726773048.88459: _low_level_execute_command(): starting
  9451 1726773048.88465: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773048.6127608-9451-210148793500449/ > /dev/null 2>&1 && sleep 0'
  9451 1726773048.91201: stderr chunk (state=2):
>>><<<
  9451 1726773048.91213: stdout chunk (state=2):
>>><<<
  9451 1726773048.91232: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9451 1726773048.91239: handler run complete
  9451 1726773048.91260: Evaluated conditional (False): False
  9451 1726773048.91273: attempt loop complete, returning result
  9451 1726773048.91277: _execute() done
  9451 1726773048.91280: dumping result to json
  9451 1726773048.91287: done dumping result, returning
  9451 1726773048.91295: done running TaskExecutor() for managed_node3/TASK: Verify no settings [0affffe7-6841-6cfb-81ae-000000000154]
  9451 1726773048.91302: sending task result for task 0affffe7-6841-6cfb-81ae-000000000154
  9451 1726773048.91343: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000154
  9451 1726773048.91347: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n  if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n    echo ERROR: \"$section\" settings present\n    rc=1\n  fi\ndone\nexit \"$rc\"\n",
    "delta": "0:00:00.007372",
    "end": "2024-09-19 15:10:48.867998",
    "rc": 0,
    "start": "2024-09-19 15:10:48.860626"
}

STDERR:

+ exec
+ rc=0
+ conf=/etc/tuned/kernel_settings/tuned.conf
+ for section in sysctl sysfs systemd vm
+ grep '^\[sysctl\]' /etc/tuned/kernel_settings/tuned.conf
+ for section in sysctl sysfs systemd vm
+ grep '^\[sysfs\]' /etc/tuned/kernel_settings/tuned.conf
+ for section in sysctl sysfs systemd vm
+ grep '^\[systemd\]' /etc/tuned/kernel_settings/tuned.conf
+ for section in sysctl sysfs systemd vm
+ grep '^\[vm\]' /etc/tuned/kernel_settings/tuned.conf
+ exit 0
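
The "Verify no settings" task ran the shell script shown in `cmd` above: it greps `/etc/tuned/kernel_settings/tuned.conf` for `[sysctl]`, `[sysfs]`, `[systemd]`, and `[vm]` section headers and exits non-zero if any remain. A Python sketch of the same check (the test itself uses the shell script, not this code; behavior for a missing file mirrors the `if grep` guard, which leaves rc at 0):

```python
# Python re-implementation of the "Verify no settings" check above: fail if
# the kernel_settings tuned.conf still contains any managed settings section.
import re
import sys

CONF = "/etc/tuned/kernel_settings/tuned.conf"
SECTIONS = ("sysctl", "sysfs", "systemd", "vm")

def verify_no_settings(conf_path: str = CONF) -> int:
    try:
        with open(conf_path) as f:
            lines = f.readlines()
    except FileNotFoundError:
        return 0  # no profile file at all also means no settings
    rc = 0
    for section in SECTIONS:
        pattern = re.compile(r"^\[" + re.escape(section) + r"\]")
        if any(pattern.match(line) for line in lines):
            print(f"ERROR: {section} settings present", file=sys.stderr)
            rc = 1
    return rc

if __name__ == "__main__":
    sys.exit(verify_no_settings())
```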
  8303 1726773048.91862: no more pending results, returning what we have
  8303 1726773048.91865: results queue empty
  8303 1726773048.91866: checking for any_errors_fatal
  8303 1726773048.91867: done checking for any_errors_fatal
  8303 1726773048.91868: checking for max_fail_percentage
  8303 1726773048.91869: done checking for max_fail_percentage
  8303 1726773048.91870: checking to see if all hosts have failed and the running result is not ok
  8303 1726773048.91870: done checking to see if all hosts have failed
  8303 1726773048.91871: getting the remaining hosts for this loop
  8303 1726773048.91872: done getting the remaining hosts for this loop
  8303 1726773048.91875: getting the next task for host managed_node3
  8303 1726773048.91880: done getting next task for host managed_node3
  8303 1726773048.91883:  ^ task is: TASK: Remove kernel_settings tuned profile
  8303 1726773048.91886:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773048.91889: getting variables
  8303 1726773048.91891: in VariableManager get_vars()
  8303 1726773048.91916: Calling all_inventory to load vars for managed_node3
  8303 1726773048.91919: Calling groups_inventory to load vars for managed_node3
  8303 1726773048.91921: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773048.91929: Calling all_plugins_play to load vars for managed_node3
  8303 1726773048.91936: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773048.91940: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773048.91990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773048.92023: done with get_vars()
  8303 1726773048.92030: done getting variables

TASK [Remove kernel_settings tuned profile] ************************************
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:36
Thursday 19 September 2024  15:10:48 -0400 (0:00:00.349)       0:00:25.499 **** 
  8303 1726773048.92116: entering _queue_task() for managed_node3/file
  8303 1726773048.92318: worker is 1 (out of 1 available)
  8303 1726773048.92333: exiting _queue_task() for managed_node3/file
  8303 1726773048.92346: done queuing things up, now waiting for results queue to drain
  8303 1726773048.92347: waiting for pending results...
  9471 1726773048.92596: running TaskExecutor() for managed_node3/TASK: Remove kernel_settings tuned profile
  9471 1726773048.92721: in run() - task 0affffe7-6841-6cfb-81ae-000000000155
  9471 1726773048.92739: variable 'ansible_search_path' from source: unknown
  9471 1726773048.92743: variable 'ansible_search_path' from source: unknown
  9471 1726773048.92778: calling self._execute()
  9471 1726773048.92850: variable 'ansible_host' from source: host vars for 'managed_node3'
  9471 1726773048.92860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9471 1726773048.92869: variable 'omit' from source: magic vars
  9471 1726773048.92972: variable 'omit' from source: magic vars
  9471 1726773048.93014: variable 'omit' from source: magic vars
  9471 1726773048.93041: variable '__kernel_settings_profile_dir' from source: role '' exported vars
  9471 1726773048.93329: variable '__kernel_settings_profile_dir' from source: role '' exported vars
  9471 1726773048.93423: variable '__kernel_settings_profile_parent' from source: set_fact
  9471 1726773048.93432: variable '__kernel_settings_tuned_profile' from source: role '' exported vars
  9471 1726773048.93471: variable 'omit' from source: magic vars
  9471 1726773048.93572: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  9471 1726773048.93611: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  9471 1726773048.93631: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  9471 1726773048.93647: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9471 1726773048.93658: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9471 1726773048.93687: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  9471 1726773048.93693: variable 'ansible_host' from source: host vars for 'managed_node3'
  9471 1726773048.93697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9471 1726773048.93792: Set connection var ansible_pipelining to False
  9471 1726773048.93803: Set connection var ansible_timeout to 10
  9471 1726773048.93810: Set connection var ansible_module_compression to ZIP_DEFLATED
  9471 1726773048.93816: Set connection var ansible_shell_executable to /bin/sh
  9471 1726773048.93820: Set connection var ansible_connection to ssh
  9471 1726773048.93827: Set connection var ansible_shell_type to sh
  9471 1726773048.93848: variable 'ansible_shell_executable' from source: unknown
  9471 1726773048.93853: variable 'ansible_connection' from source: unknown
  9471 1726773048.93857: variable 'ansible_module_compression' from source: unknown
  9471 1726773048.93860: variable 'ansible_shell_type' from source: unknown
  9471 1726773048.93863: variable 'ansible_shell_executable' from source: unknown
  9471 1726773048.93866: variable 'ansible_host' from source: host vars for 'managed_node3'
  9471 1726773048.93870: variable 'ansible_pipelining' from source: unknown
  9471 1726773048.93873: variable 'ansible_timeout' from source: unknown
  9471 1726773048.93877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9471 1726773048.94071: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
  9471 1726773048.94083: variable 'omit' from source: magic vars
  9471 1726773048.94092: starting attempt loop
  9471 1726773048.94095: running the handler
  9471 1726773048.94108: _low_level_execute_command(): starting
  9471 1726773048.94115: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  9471 1726773048.96823: stdout chunk (state=2):
>>>/root
<<<
  9471 1726773048.96968: stderr chunk (state=3):
>>><<<
  9471 1726773048.96978: stdout chunk (state=3):
>>><<<
  9471 1726773048.97003: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  9471 1726773048.97019: _low_level_execute_command(): starting
  9471 1726773048.97026: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773048.9701188-9471-201782053265629 `" && echo ansible-tmp-1726773048.9701188-9471-201782053265629="` echo /root/.ansible/tmp/ansible-tmp-1726773048.9701188-9471-201782053265629 `" ) && sleep 0'
  9471 1726773049.00018: stdout chunk (state=2):
>>>ansible-tmp-1726773048.9701188-9471-201782053265629=/root/.ansible/tmp/ansible-tmp-1726773048.9701188-9471-201782053265629
<<<
  9471 1726773049.00363: stderr chunk (state=3):
>>><<<
  9471 1726773049.00377: stdout chunk (state=3):
>>><<<
  9471 1726773049.00399: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773048.9701188-9471-201782053265629=/root/.ansible/tmp/ansible-tmp-1726773048.9701188-9471-201782053265629
, stderr=
  9471 1726773049.00447: variable 'ansible_module_compression' from source: unknown
  9471 1726773049.00510: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED
  9471 1726773049.00549: variable 'ansible_facts' from source: unknown
  9471 1726773049.00653: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773048.9701188-9471-201782053265629/AnsiballZ_file.py
  9471 1726773049.01166: Sending initial data
  9471 1726773049.01177: Sent initial data (151 bytes)
  9471 1726773049.04287: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpvufjej99 /root/.ansible/tmp/ansible-tmp-1726773048.9701188-9471-201782053265629/AnsiballZ_file.py
<<<
  9471 1726773049.05930: stderr chunk (state=3):
>>><<<
  9471 1726773049.05944: stdout chunk (state=3):
>>><<<
  9471 1726773049.05974: done transferring module to remote
  9471 1726773049.05990: _low_level_execute_command(): starting
  9471 1726773049.05997: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773048.9701188-9471-201782053265629/ /root/.ansible/tmp/ansible-tmp-1726773048.9701188-9471-201782053265629/AnsiballZ_file.py && sleep 0'
  9471 1726773049.08911: stderr chunk (state=2):
>>><<<
  9471 1726773049.08924: stdout chunk (state=2):
>>><<<
  9471 1726773049.08942: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9471 1726773049.08947: _low_level_execute_command(): starting
  9471 1726773049.08953: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773048.9701188-9471-201782053265629/AnsiballZ_file.py && sleep 0'
  9471 1726773049.25024: stdout chunk (state=2):
>>>
{"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "directory", "path_content": {"directories": [], "files": ["/etc/tuned/kernel_settings/tuned.conf"]}}, "after": {"path": "/etc/tuned/kernel_settings", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "absent", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}}
<<<
  9471 1726773049.26197: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  9471 1726773049.26209: stdout chunk (state=3):
>>><<<
  9471 1726773049.26221: stderr chunk (state=3):
>>><<<
  9471 1726773049.26234: _low_level_execute_command() done: rc=0, stdout=
{"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "directory", "path_content": {"directories": [], "files": ["/etc/tuned/kernel_settings/tuned.conf"]}}, "after": {"path": "/etc/tuned/kernel_settings", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "absent", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}}
, stderr=Shared connection to 10.31.47.99 closed.
  9471 1726773049.26269: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'absent', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773048.9701188-9471-201782053265629/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  9471 1726773049.26281: _low_level_execute_command(): starting
  9471 1726773049.26289: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773048.9701188-9471-201782053265629/ > /dev/null 2>&1 && sleep 0'
  9471 1726773049.28842: stderr chunk (state=2):
>>><<<
  9471 1726773049.28854: stdout chunk (state=2):
>>><<<
  9471 1726773049.28872: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9471 1726773049.28880: handler run complete
  9471 1726773049.28901: attempt loop complete, returning result
  9471 1726773049.28905: _execute() done
  9471 1726773049.28908: dumping result to json
  9471 1726773049.28914: done dumping result, returning
  9471 1726773049.28922: done running TaskExecutor() for managed_node3/TASK: Remove kernel_settings tuned profile [0affffe7-6841-6cfb-81ae-000000000155]
  9471 1726773049.28930: sending task result for task 0affffe7-6841-6cfb-81ae-000000000155
  9471 1726773049.28962: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000155
  9471 1726773049.28965: WORKER PROCESS EXITING
changed: [managed_node3] => {
    "changed": true,
    "path": "/etc/tuned/kernel_settings",
    "state": "absent"
}
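
Here the cleanup uses the `file` module with `state: absent` to delete `/etc/tuned/kernel_settings`, including the `tuned.conf` listed in the diff. A rough local equivalent of that operation (illustration only; the real work is done by the transferred AnsiballZ_file.py module on the managed node):

```python
# Rough equivalent of: file: path=/etc/tuned/kernel_settings state=absent
import os
import shutil

def remove_profile_dir(path: str = "/etc/tuned/kernel_settings") -> bool:
    """Return True if something was removed (i.e. the task is 'changed')."""
    if not os.path.lexists(path):
        return False          # already absent -> changed: false
    if os.path.isdir(path) and not os.path.islink(path):
        shutil.rmtree(path)   # removes tuned.conf and the directory itself
    else:
        os.unlink(path)
    return True               # matches the changed: true result above
```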
  8303 1726773049.29110: no more pending results, returning what we have
  8303 1726773049.29112: results queue empty
  8303 1726773049.29113: checking for any_errors_fatal
  8303 1726773049.29124: done checking for any_errors_fatal
  8303 1726773049.29124: checking for max_fail_percentage
  8303 1726773049.29126: done checking for max_fail_percentage
  8303 1726773049.29126: checking to see if all hosts have failed and the running result is not ok
  8303 1726773049.29127: done checking to see if all hosts have failed
  8303 1726773049.29127: getting the remaining hosts for this loop
  8303 1726773049.29128: done getting the remaining hosts for this loop
  8303 1726773049.29132: getting the next task for host managed_node3
  8303 1726773049.29137: done getting next task for host managed_node3
  8303 1726773049.29140:  ^ task is: TASK: Get active_profile
  8303 1726773049.29142:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773049.29145: getting variables
  8303 1726773049.29146: in VariableManager get_vars()
  8303 1726773049.29179: Calling all_inventory to load vars for managed_node3
  8303 1726773049.29182: Calling groups_inventory to load vars for managed_node3
  8303 1726773049.29184: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773049.29195: Calling all_plugins_play to load vars for managed_node3
  8303 1726773049.29197: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773049.29200: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773049.29242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773049.29270: done with get_vars()
  8303 1726773049.29276: done getting variables

TASK [Get active_profile] ******************************************************
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:41
Thursday 19 September 2024  15:10:49 -0400 (0:00:00.372)       0:00:25.872 **** 
  8303 1726773049.29344: entering _queue_task() for managed_node3/slurp
  8303 1726773049.29530: worker is 1 (out of 1 available)
  8303 1726773049.29545: exiting _queue_task() for managed_node3/slurp
  8303 1726773049.29558: done queuing things up, now waiting for results queue to drain
  8303 1726773049.29559: waiting for pending results...
  9483 1726773049.29674: running TaskExecutor() for managed_node3/TASK: Get active_profile
  9483 1726773049.29775: in run() - task 0affffe7-6841-6cfb-81ae-000000000156
  9483 1726773049.29791: variable 'ansible_search_path' from source: unknown
  9483 1726773049.29796: variable 'ansible_search_path' from source: unknown
  9483 1726773049.29825: calling self._execute()
  9483 1726773049.29878: variable 'ansible_host' from source: host vars for 'managed_node3'
  9483 1726773049.29888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9483 1726773049.29895: variable 'omit' from source: magic vars
  9483 1726773049.29971: variable 'omit' from source: magic vars
  9483 1726773049.30009: variable 'omit' from source: magic vars
  9483 1726773049.30032: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars
  9483 1726773049.30266: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars
  9483 1726773049.30330: variable '__kernel_settings_tuned_dir' from source: role '' exported vars
  9483 1726773049.30364: variable 'omit' from source: magic vars
  9483 1726773049.30400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  9483 1726773049.30426: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  9483 1726773049.30445: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  9483 1726773049.30461: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9483 1726773049.30473: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9483 1726773049.30499: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  9483 1726773049.30504: variable 'ansible_host' from source: host vars for 'managed_node3'
  9483 1726773049.30508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9483 1726773049.30578: Set connection var ansible_pipelining to False
  9483 1726773049.30590: Set connection var ansible_timeout to 10
  9483 1726773049.30596: Set connection var ansible_module_compression to ZIP_DEFLATED
  9483 1726773049.30602: Set connection var ansible_shell_executable to /bin/sh
  9483 1726773049.30605: Set connection var ansible_connection to ssh
  9483 1726773049.30612: Set connection var ansible_shell_type to sh
  9483 1726773049.30630: variable 'ansible_shell_executable' from source: unknown
  9483 1726773049.30634: variable 'ansible_connection' from source: unknown
  9483 1726773049.30637: variable 'ansible_module_compression' from source: unknown
  9483 1726773049.30640: variable 'ansible_shell_type' from source: unknown
  9483 1726773049.30644: variable 'ansible_shell_executable' from source: unknown
  9483 1726773049.30647: variable 'ansible_host' from source: host vars for 'managed_node3'
  9483 1726773049.30652: variable 'ansible_pipelining' from source: unknown
  9483 1726773049.30655: variable 'ansible_timeout' from source: unknown
  9483 1726773049.30659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9483 1726773049.30804: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
  9483 1726773049.30814: variable 'omit' from source: magic vars
  9483 1726773049.30820: starting attempt loop
  9483 1726773049.30824: running the handler
  9483 1726773049.30834: _low_level_execute_command(): starting
  9483 1726773049.30842: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  9483 1726773049.33223: stdout chunk (state=2):
>>>/root
<<<
  9483 1726773049.33347: stderr chunk (state=3):
>>><<<
  9483 1726773049.33356: stdout chunk (state=3):
>>><<<
  9483 1726773049.33377: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  9483 1726773049.33394: _low_level_execute_command(): starting
  9483 1726773049.33402: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773049.333876-9483-75569024844580 `" && echo ansible-tmp-1726773049.333876-9483-75569024844580="` echo /root/.ansible/tmp/ansible-tmp-1726773049.333876-9483-75569024844580 `" ) && sleep 0'
  9483 1726773049.35991: stdout chunk (state=2):
>>>ansible-tmp-1726773049.333876-9483-75569024844580=/root/.ansible/tmp/ansible-tmp-1726773049.333876-9483-75569024844580
<<<
  9483 1726773049.36128: stderr chunk (state=3):
>>><<<
  9483 1726773049.36137: stdout chunk (state=3):
>>><<<
  9483 1726773049.36155: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773049.333876-9483-75569024844580=/root/.ansible/tmp/ansible-tmp-1726773049.333876-9483-75569024844580
, stderr=
  9483 1726773049.36197: variable 'ansible_module_compression' from source: unknown
  9483 1726773049.36234: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.slurp-ZIP_DEFLATED
  9483 1726773049.36267: variable 'ansible_facts' from source: unknown
  9483 1726773049.36345: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773049.333876-9483-75569024844580/AnsiballZ_slurp.py
  9483 1726773049.36455: Sending initial data
  9483 1726773049.36463: Sent initial data (150 bytes)
  9483 1726773049.39064: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpu_ofs6mv /root/.ansible/tmp/ansible-tmp-1726773049.333876-9483-75569024844580/AnsiballZ_slurp.py
<<<
  9483 1726773049.40246: stderr chunk (state=3):
>>><<<
  9483 1726773049.40258: stdout chunk (state=3):
>>><<<
  9483 1726773049.40282: done transferring module to remote
  9483 1726773049.40297: _low_level_execute_command(): starting
  9483 1726773049.40303: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773049.333876-9483-75569024844580/ /root/.ansible/tmp/ansible-tmp-1726773049.333876-9483-75569024844580/AnsiballZ_slurp.py && sleep 0'
  9483 1726773049.42812: stderr chunk (state=2):
>>><<<
  9483 1726773049.42824: stdout chunk (state=2):
>>><<<
  9483 1726773049.42840: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9483 1726773049.42845: _low_level_execute_command(): starting
  9483 1726773049.42851: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773049.333876-9483-75569024844580/AnsiballZ_slurp.py && sleep 0'
  9483 1726773049.58173: stdout chunk (state=2):
>>>
{"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}}
<<<
  9483 1726773049.59363: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  9483 1726773049.59375: stdout chunk (state=3):
>>><<<
  9483 1726773049.59387: stderr chunk (state=3):
>>><<<
  9483 1726773049.59401: _low_level_execute_command() done: rc=0, stdout=
{"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}}
, stderr=Shared connection to 10.31.47.99 closed.
  9483 1726773049.59428: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773049.333876-9483-75569024844580/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  9483 1726773049.59440: _low_level_execute_command(): starting
  9483 1726773049.59446: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773049.333876-9483-75569024844580/ > /dev/null 2>&1 && sleep 0'
  9483 1726773049.62211: stderr chunk (state=2):
>>><<<
  9483 1726773049.62223: stdout chunk (state=2):
>>><<<
  9483 1726773049.62240: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9483 1726773049.62247: handler run complete
  9483 1726773049.62265: attempt loop complete, returning result
  9483 1726773049.62270: _execute() done
  9483 1726773049.62273: dumping result to json
  9483 1726773049.62277: done dumping result, returning
  9483 1726773049.62284: done running TaskExecutor() for managed_node3/TASK: Get active_profile [0affffe7-6841-6cfb-81ae-000000000156]
  9483 1726773049.62292: sending task result for task 0affffe7-6841-6cfb-81ae-000000000156
  9483 1726773049.62326: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000156
  9483 1726773049.62330: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK",
    "encoding": "base64",
    "source": "/etc/tuned/active_profile"
}
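
The slurp module returns file contents base64-encoded; the string above decodes to "virtual-guest kernel_settings" followed by a newline, which is why the next task rewrites the file without the kernel_settings entry. As a minimal illustration (not part of this run), a registered slurp result can be decoded in a follow-up task with the b64decode filter; the variable name is assumed from the __active_profile task var that appears later in this log:

    # Illustrative sketch only -- not taken from cleanup.yml.
    # Assumes the slurp result above was registered as __active_profile.
    - name: Show the decoded active_profile contents
      ansible.builtin.debug:
        msg: "{{ __active_profile.content | b64decode }}"
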
  8303 1726773049.62716: no more pending results, returning what we have
  8303 1726773049.62720: results queue empty
  8303 1726773049.62720: checking for any_errors_fatal
  8303 1726773049.62728: done checking for any_errors_fatal
  8303 1726773049.62728: checking for max_fail_percentage
  8303 1726773049.62730: done checking for max_fail_percentage
  8303 1726773049.62730: checking to see if all hosts have failed and the running result is not ok
  8303 1726773049.62731: done checking to see if all hosts have failed
  8303 1726773049.62732: getting the remaining hosts for this loop
  8303 1726773049.62733: done getting the remaining hosts for this loop
  8303 1726773049.62736: getting the next task for host managed_node3
  8303 1726773049.62742: done getting next task for host managed_node3
  8303 1726773049.62744:  ^ task is: TASK: Ensure kernel_settings is not in active_profile
  8303 1726773049.62747:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773049.62750: getting variables
  8303 1726773049.62751: in VariableManager get_vars()
  8303 1726773049.62781: Calling all_inventory to load vars for managed_node3
  8303 1726773049.62784: Calling groups_inventory to load vars for managed_node3
  8303 1726773049.62788: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773049.62797: Calling all_plugins_play to load vars for managed_node3
  8303 1726773049.62799: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773049.62802: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773049.62849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773049.62888: done with get_vars()
  8303 1726773049.62896: done getting variables
  8303 1726773049.62949: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Ensure kernel_settings is not in active_profile] *************************
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:46
Thursday 19 September 2024  15:10:49 -0400 (0:00:00.336)       0:00:26.208 **** 
  8303 1726773049.62978: entering _queue_task() for managed_node3/copy
  8303 1726773049.63181: worker is 1 (out of 1 available)
  8303 1726773049.63197: exiting _queue_task() for managed_node3/copy
  8303 1726773049.63207: done queuing things up, now waiting for results queue to drain
  8303 1726773049.63208: waiting for pending results...
  9495 1726773049.63495: running TaskExecutor() for managed_node3/TASK: Ensure kernel_settings is not in active_profile
  9495 1726773049.63612: in run() - task 0affffe7-6841-6cfb-81ae-000000000157
  9495 1726773049.63628: variable 'ansible_search_path' from source: unknown
  9495 1726773049.63632: variable 'ansible_search_path' from source: unknown
  9495 1726773049.63664: calling self._execute()
  9495 1726773049.63737: variable 'ansible_host' from source: host vars for 'managed_node3'
  9495 1726773049.63747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9495 1726773049.63757: variable 'omit' from source: magic vars
  9495 1726773049.63854: variable 'omit' from source: magic vars
  9495 1726773049.63896: variable 'omit' from source: magic vars
  9495 1726773049.63923: variable '__active_profile' from source: task vars
  9495 1726773049.64203: variable '__active_profile' from source: task vars
  9495 1726773049.64472: variable '__cur_profile' from source: task vars
  9495 1726773049.64625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
  9495 1726773049.66985: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
  9495 1726773049.67065: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
  9495 1726773049.67111: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
  9495 1726773049.67155: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
  9495 1726773049.67187: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
  9495 1726773049.67968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
  9495 1726773049.68017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
  9495 1726773049.68043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
  9495 1726773049.68083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
  9495 1726773049.68098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
  9495 1726773049.68203: variable '__kernel_settings_tuned_current_profile' from source: set_fact
  9495 1726773049.68255: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars
  9495 1726773049.68325: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars
  9495 1726773049.68396: variable '__kernel_settings_tuned_dir' from source: role '' exported vars
  9495 1726773049.68422: variable 'omit' from source: magic vars
  9495 1726773049.68446: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  9495 1726773049.68471: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  9495 1726773049.68490: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  9495 1726773049.68507: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9495 1726773049.68518: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9495 1726773049.68545: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  9495 1726773049.68550: variable 'ansible_host' from source: host vars for 'managed_node3'
  9495 1726773049.68554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9495 1726773049.68647: Set connection var ansible_pipelining to False
  9495 1726773049.68658: Set connection var ansible_timeout to 10
  9495 1726773049.68664: Set connection var ansible_module_compression to ZIP_DEFLATED
  9495 1726773049.68669: Set connection var ansible_shell_executable to /bin/sh
  9495 1726773049.68673: Set connection var ansible_connection to ssh
  9495 1726773049.68680: Set connection var ansible_shell_type to sh
  9495 1726773049.68704: variable 'ansible_shell_executable' from source: unknown
  9495 1726773049.68709: variable 'ansible_connection' from source: unknown
  9495 1726773049.68712: variable 'ansible_module_compression' from source: unknown
  9495 1726773049.68715: variable 'ansible_shell_type' from source: unknown
  9495 1726773049.68717: variable 'ansible_shell_executable' from source: unknown
  9495 1726773049.68719: variable 'ansible_host' from source: host vars for 'managed_node3'
  9495 1726773049.68723: variable 'ansible_pipelining' from source: unknown
  9495 1726773049.68726: variable 'ansible_timeout' from source: unknown
  9495 1726773049.68729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9495 1726773049.68813: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  9495 1726773049.68825: variable 'omit' from source: magic vars
  9495 1726773049.68830: starting attempt loop
  9495 1726773049.68833: running the handler
  9495 1726773049.68842: _low_level_execute_command(): starting
  9495 1726773049.68848: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  9495 1726773049.71651: stdout chunk (state=2):
>>>/root
<<<
  9495 1726773049.71786: stderr chunk (state=3):
>>><<<
  9495 1726773049.71796: stdout chunk (state=3):
>>><<<
  9495 1726773049.71820: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  9495 1726773049.71834: _low_level_execute_command(): starting
  9495 1726773049.71841: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348 `" && echo ansible-tmp-1726773049.7182913-9495-249297695255348="` echo /root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348 `" ) && sleep 0'
  9495 1726773049.74658: stdout chunk (state=2):
>>>ansible-tmp-1726773049.7182913-9495-249297695255348=/root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348
<<<
  9495 1726773049.74790: stderr chunk (state=3):
>>><<<
  9495 1726773049.74796: stdout chunk (state=3):
>>><<<
  9495 1726773049.74817: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773049.7182913-9495-249297695255348=/root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348
, stderr=
  9495 1726773049.74879: variable 'ansible_module_compression' from source: unknown
  9495 1726773049.74923: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
  9495 1726773049.74950: variable 'ansible_facts' from source: unknown
  9495 1726773049.75024: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348/AnsiballZ_stat.py
  9495 1726773049.75113: Sending initial data
  9495 1726773049.75120: Sent initial data (151 bytes)
  9495 1726773049.77790: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpldpkrc0d /root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348/AnsiballZ_stat.py
<<<
  9495 1726773049.79269: stderr chunk (state=3):
>>><<<
  9495 1726773049.79279: stdout chunk (state=3):
>>><<<
  9495 1726773049.79299: done transferring module to remote
  9495 1726773049.79310: _low_level_execute_command(): starting
  9495 1726773049.79315: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348/ /root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348/AnsiballZ_stat.py && sleep 0'
  9495 1726773049.81778: stderr chunk (state=2):
>>><<<
  9495 1726773049.81789: stdout chunk (state=2):
>>><<<
  9495 1726773049.81804: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9495 1726773049.81808: _low_level_execute_command(): starting
  9495 1726773049.81813: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348/AnsiballZ_stat.py && sleep 0'
  9495 1726773049.98293: stdout chunk (state=2):
>>>
{"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 499122370, "dev": 51713, "nlink": 1, "atime": 1726773046.1880348, "mtime": 1726773038.3080044, "ctime": 1726773038.3080044, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "1754931174", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}}
<<<
  9495 1726773049.99480: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  9495 1726773049.99524: stderr chunk (state=3):
>>><<<
  9495 1726773049.99531: stdout chunk (state=3):
>>><<<
  9495 1726773049.99547: _low_level_execute_command() done: rc=0, stdout=
{"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 499122370, "dev": 51713, "nlink": 1, "atime": 1726773046.1880348, "mtime": 1726773038.3080044, "ctime": 1726773038.3080044, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "1754931174", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}}
, stderr=Shared connection to 10.31.47.99 closed.
  9495 1726773049.99597: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  9495 1726773049.99684: Sending initial data
  9495 1726773049.99693: Sent initial data (140 bytes)
  9495 1726773050.02585: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpj85rtjyo /root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348/source
<<<
  9495 1726773050.03425: stderr chunk (state=3):
>>><<<
  9495 1726773050.03433: stdout chunk (state=3):
>>><<<
  9495 1726773050.03452: _low_level_execute_command(): starting
  9495 1726773050.03458: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348/ /root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348/source && sleep 0'
  9495 1726773050.06039: stderr chunk (state=2):
>>><<<
  9495 1726773050.06049: stdout chunk (state=2):
>>><<<
  9495 1726773050.06062: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9495 1726773050.06081: variable 'ansible_module_compression' from source: unknown
  9495 1726773050.06133: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED
  9495 1726773050.06152: variable 'ansible_facts' from source: unknown
  9495 1726773050.06215: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348/AnsiballZ_copy.py
  9495 1726773050.06306: Sending initial data
  9495 1726773050.06313: Sent initial data (151 bytes)
  9495 1726773050.08923: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpewr8uh7q /root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348/AnsiballZ_copy.py
<<<
  9495 1726773050.10464: stderr chunk (state=3):
>>><<<
  9495 1726773050.10477: stdout chunk (state=3):
>>><<<
  9495 1726773050.10502: done transferring module to remote
  9495 1726773050.10512: _low_level_execute_command(): starting
  9495 1726773050.10517: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348/ /root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348/AnsiballZ_copy.py && sleep 0'
  9495 1726773050.13205: stderr chunk (state=2):
>>><<<
  9495 1726773050.13219: stdout chunk (state=2):
>>><<<
  9495 1726773050.13237: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9495 1726773050.13243: _low_level_execute_command(): starting
  9495 1726773050.13249: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348/AnsiballZ_copy.py && sleep 0'
  9495 1726773050.29930: stdout chunk (state=2):
>>>
{"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348/source", "md5sum": "9a561d913bcdb5a659ec2dd035975a8e", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 14, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348/source", "_original_basename": "tmpj85rtjyo", "follow": false, "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}}
<<<
  9495 1726773050.31151: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  9495 1726773050.31202: stderr chunk (state=3):
>>><<<
  9495 1726773050.31210: stdout chunk (state=3):
>>><<<
  9495 1726773050.31227: _low_level_execute_command() done: rc=0, stdout=
{"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348/source", "md5sum": "9a561d913bcdb5a659ec2dd035975a8e", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 14, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348/source", "_original_basename": "tmpj85rtjyo", "follow": false, "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}}
, stderr=Shared connection to 10.31.47.99 closed.
  9495 1726773050.31252: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/active_profile', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348/source', '_original_basename': 'tmpj85rtjyo', 'follow': False, 'checksum': '633f07e1b5698d04352d5dca735869bf2fe77897', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  9495 1726773050.31262: _low_level_execute_command(): starting
  9495 1726773050.31269: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348/ > /dev/null 2>&1 && sleep 0'
  9495 1726773050.33756: stderr chunk (state=2):
>>><<<
  9495 1726773050.33766: stdout chunk (state=2):
>>><<<
  9495 1726773050.33787: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9495 1726773050.33796: handler run complete
  9495 1726773050.33814: attempt loop complete, returning result
  9495 1726773050.33817: _execute() done
  9495 1726773050.33821: dumping result to json
  9495 1726773050.33826: done dumping result, returning
  9495 1726773050.33834: done running TaskExecutor() for managed_node3/TASK: Ensure kernel_settings is not in active_profile [0affffe7-6841-6cfb-81ae-000000000157]
  9495 1726773050.33840: sending task result for task 0affffe7-6841-6cfb-81ae-000000000157
  9495 1726773050.33872: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000157
  9495 1726773050.33876: WORKER PROCESS EXITING
changed: [managed_node3] => {
    "changed": true,
    "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897",
    "dest": "/etc/tuned/active_profile",
    "gid": 0,
    "group": "root",
    "md5sum": "9a561d913bcdb5a659ec2dd035975a8e",
    "mode": "0600",
    "owner": "root",
    "secontext": "system_u:object_r:tuned_rw_etc_t:s0",
    "size": 14,
    "src": "/root/.ansible/tmp/ansible-tmp-1726773049.7182913-9495-249297695255348/source",
    "state": "file",
    "uid": 0
}
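
The changed result shows a 14-byte file written to /etc/tuned/active_profile with mode 0600, consistent with the previous contents ("virtual-guest kernel_settings") minus the kernel_settings entry, i.e. "virtual-guest" plus a trailing newline. The actual task lives at tests/kernel_settings/tasks/cleanup.yml:46 and is not reproduced in this log; a hedged sketch of a task that could produce this kind of result looks like:

    # Sketch only; the variable name and the exact filter chain are assumptions,
    # not the contents of cleanup.yml.
    - name: Ensure kernel_settings is not in active_profile
      ansible.builtin.copy:
        content: "{{ __active_profile.content | b64decode | replace(' kernel_settings', '') }}"
        dest: /etc/tuned/active_profile
        mode: "0600"
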
  8303 1726773050.34013: no more pending results, returning what we have
  8303 1726773050.34016: results queue empty
  8303 1726773050.34016: checking for any_errors_fatal
  8303 1726773050.34023: done checking for any_errors_fatal
  8303 1726773050.34024: checking for max_fail_percentage
  8303 1726773050.34025: done checking for max_fail_percentage
  8303 1726773050.34026: checking to see if all hosts have failed and the running result is not ok
  8303 1726773050.34026: done checking to see if all hosts have failed
  8303 1726773050.34027: getting the remaining hosts for this loop
  8303 1726773050.34028: done getting the remaining hosts for this loop
  8303 1726773050.34031: getting the next task for host managed_node3
  8303 1726773050.34036: done getting next task for host managed_node3
  8303 1726773050.34038:  ^ task is: TASK: Set profile_mode to auto
  8303 1726773050.34040:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773050.34043: getting variables
  8303 1726773050.34045: in VariableManager get_vars()
  8303 1726773050.34076: Calling all_inventory to load vars for managed_node3
  8303 1726773050.34079: Calling groups_inventory to load vars for managed_node3
  8303 1726773050.34081: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773050.34091: Calling all_plugins_play to load vars for managed_node3
  8303 1726773050.34094: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773050.34096: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773050.34140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773050.34167: done with get_vars()
  8303 1726773050.34174: done getting variables
  8303 1726773050.34216: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Set profile_mode to auto] ************************************************
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:57
Thursday 19 September 2024  15:10:50 -0400 (0:00:00.712)       0:00:26.920 **** 
  8303 1726773050.34237: entering _queue_task() for managed_node3/copy
  8303 1726773050.34401: worker is 1 (out of 1 available)
  8303 1726773050.34416: exiting _queue_task() for managed_node3/copy
  8303 1726773050.34428: done queuing things up, now waiting for results queue to drain
  8303 1726773050.34429: waiting for pending results...
  9529 1726773050.34546: running TaskExecutor() for managed_node3/TASK: Set profile_mode to auto
  9529 1726773050.34642: in run() - task 0affffe7-6841-6cfb-81ae-000000000158
  9529 1726773050.34658: variable 'ansible_search_path' from source: unknown
  9529 1726773050.34662: variable 'ansible_search_path' from source: unknown
  9529 1726773050.34694: calling self._execute()
  9529 1726773050.34750: variable 'ansible_host' from source: host vars for 'managed_node3'
  9529 1726773050.34759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9529 1726773050.34771: variable 'omit' from source: magic vars
  9529 1726773050.34848: variable 'omit' from source: magic vars
  9529 1726773050.34883: variable 'omit' from source: magic vars
  9529 1726773050.34906: variable '__kernel_settings_tuned_profile_mode' from source: role '' exported vars
  9529 1726773050.35183: variable '__kernel_settings_tuned_profile_mode' from source: role '' exported vars
  9529 1726773050.35245: variable '__kernel_settings_tuned_dir' from source: role '' exported vars
  9529 1726773050.35276: variable 'omit' from source: magic vars
  9529 1726773050.35310: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  9529 1726773050.35337: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  9529 1726773050.35354: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  9529 1726773050.35372: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9529 1726773050.35383: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9529 1726773050.35408: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  9529 1726773050.35413: variable 'ansible_host' from source: host vars for 'managed_node3'
  9529 1726773050.35417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9529 1726773050.35488: Set connection var ansible_pipelining to False
  9529 1726773050.35498: Set connection var ansible_timeout to 10
  9529 1726773050.35504: Set connection var ansible_module_compression to ZIP_DEFLATED
  9529 1726773050.35509: Set connection var ansible_shell_executable to /bin/sh
  9529 1726773050.35512: Set connection var ansible_connection to ssh
  9529 1726773050.35520: Set connection var ansible_shell_type to sh
  9529 1726773050.35535: variable 'ansible_shell_executable' from source: unknown
  9529 1726773050.35539: variable 'ansible_connection' from source: unknown
  9529 1726773050.35541: variable 'ansible_module_compression' from source: unknown
  9529 1726773050.35543: variable 'ansible_shell_type' from source: unknown
  9529 1726773050.35545: variable 'ansible_shell_executable' from source: unknown
  9529 1726773050.35546: variable 'ansible_host' from source: host vars for 'managed_node3'
  9529 1726773050.35548: variable 'ansible_pipelining' from source: unknown
  9529 1726773050.35550: variable 'ansible_timeout' from source: unknown
  9529 1726773050.35552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9529 1726773050.35661: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  9529 1726773050.35674: variable 'omit' from source: magic vars
  9529 1726773050.35680: starting attempt loop
  9529 1726773050.35684: running the handler
  9529 1726773050.35695: _low_level_execute_command(): starting
  9529 1726773050.35702: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  9529 1726773050.38101: stdout chunk (state=2):
>>>/root
<<<
  9529 1726773050.38225: stderr chunk (state=3):
>>><<<
  9529 1726773050.38232: stdout chunk (state=3):
>>><<<
  9529 1726773050.38253: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  9529 1726773050.38270: _low_level_execute_command(): starting
  9529 1726773050.38278: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777 `" && echo ansible-tmp-1726773050.3826144-9529-216058296666777="` echo /root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777 `" ) && sleep 0'
  9529 1726773050.40833: stdout chunk (state=2):
>>>ansible-tmp-1726773050.3826144-9529-216058296666777=/root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777
<<<
  9529 1726773050.40970: stderr chunk (state=3):
>>><<<
  9529 1726773050.40978: stdout chunk (state=3):
>>><<<
  9529 1726773050.40996: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773050.3826144-9529-216058296666777=/root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777
, stderr=
  9529 1726773050.41071: variable 'ansible_module_compression' from source: unknown
  9529 1726773050.41117: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
  9529 1726773050.41150: variable 'ansible_facts' from source: unknown
  9529 1726773050.41222: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777/AnsiballZ_stat.py
  9529 1726773050.41316: Sending initial data
  9529 1726773050.41323: Sent initial data (151 bytes)
  9529 1726773050.44069: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpq0wqumsc /root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777/AnsiballZ_stat.py
<<<
  9529 1726773050.45458: stderr chunk (state=3):
>>><<<
  9529 1726773050.45469: stdout chunk (state=3):
>>><<<
  9529 1726773050.45496: done transferring module to remote
  9529 1726773050.45508: _low_level_execute_command(): starting
  9529 1726773050.45515: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777/ /root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777/AnsiballZ_stat.py && sleep 0'
  9529 1726773050.48233: stderr chunk (state=2):
>>><<<
  9529 1726773050.48245: stdout chunk (state=2):
>>><<<
  9529 1726773050.48264: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9529 1726773050.48272: _low_level_execute_command(): starting
  9529 1726773050.48279: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777/AnsiballZ_stat.py && sleep 0'
  9529 1726773050.64933: stdout chunk (state=2):
>>>
{"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 505413763, "dev": 51713, "nlink": 1, "atime": 1726773047.1600385, "mtime": 1726773038.3090043, "ctime": 1726773038.3090043, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "911778068", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}}
<<<
  9529 1726773050.66162: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  9529 1726773050.66212: stderr chunk (state=3):
>>><<<
  9529 1726773050.66220: stdout chunk (state=3):
>>><<<
  9529 1726773050.66237: _low_level_execute_command() done: rc=0, stdout=
{"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 505413763, "dev": 51713, "nlink": 1, "atime": 1726773047.1600385, "mtime": 1726773038.3090043, "ctime": 1726773038.3090043, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "911778068", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}}
, stderr=Shared connection to 10.31.47.99 closed.
  9529 1726773050.66292: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  9529 1726773050.66382: Sending initial data
  9529 1726773050.66392: Sent initial data (140 bytes)
  9529 1726773050.69056: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmp53eibmse /root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777/source
<<<
  9529 1726773050.69532: stderr chunk (state=3):
>>><<<
  9529 1726773050.69543: stdout chunk (state=3):
>>><<<
  9529 1726773050.69566: _low_level_execute_command(): starting
  9529 1726773050.69573: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777/ /root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777/source && sleep 0'
  9529 1726773050.72215: stderr chunk (state=2):
>>><<<
  9529 1726773050.72231: stdout chunk (state=2):
>>><<<
  9529 1726773050.72250: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9529 1726773050.72278: variable 'ansible_module_compression' from source: unknown
  9529 1726773050.72334: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED
  9529 1726773050.72358: variable 'ansible_facts' from source: unknown
  9529 1726773050.72445: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777/AnsiballZ_copy.py
  9529 1726773050.72902: Sending initial data
  9529 1726773050.72909: Sent initial data (151 bytes)
  9529 1726773050.75674: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpbb8_8vcz /root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777/AnsiballZ_copy.py
<<<
  9529 1726773050.76901: stderr chunk (state=3):
>>><<<
  9529 1726773050.76912: stdout chunk (state=3):
>>><<<
  9529 1726773050.76933: done transferring module to remote
  9529 1726773050.76942: _low_level_execute_command(): starting
  9529 1726773050.76947: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777/ /root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777/AnsiballZ_copy.py && sleep 0'
  9529 1726773050.79463: stderr chunk (state=2):
>>><<<
  9529 1726773050.79475: stdout chunk (state=2):
>>><<<
  9529 1726773050.79493: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9529 1726773050.79498: _low_level_execute_command(): starting
  9529 1726773050.79503: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777/AnsiballZ_copy.py && sleep 0'
  9529 1726773050.96204: stdout chunk (state=2):
>>>
{"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777/source", "md5sum": "451e20aff0f489cd2f7d4d73533aa961", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 5, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777/source", "_original_basename": "tmp53eibmse", "follow": false, "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}}
<<<
  9529 1726773050.97415: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  9529 1726773050.97470: stderr chunk (state=3):
>>><<<
  9529 1726773050.97478: stdout chunk (state=3):
>>><<<
  9529 1726773050.97497: _low_level_execute_command() done: rc=0, stdout=
{"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777/source", "md5sum": "451e20aff0f489cd2f7d4d73533aa961", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 5, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777/source", "_original_basename": "tmp53eibmse", "follow": false, "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}}
, stderr=Shared connection to 10.31.47.99 closed.
  9529 1726773050.97523: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/profile_mode', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777/source', '_original_basename': 'tmp53eibmse', 'follow': False, 'checksum': '43683f4e92c48be4b00ddd86e011a4f27fcdbeb5', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  9529 1726773050.97534: _low_level_execute_command(): starting
  9529 1726773050.97541: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777/ > /dev/null 2>&1 && sleep 0'
  9529 1726773051.00062: stderr chunk (state=2):
>>><<<
  9529 1726773051.00076: stdout chunk (state=2):
>>><<<
  9529 1726773051.00094: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9529 1726773051.00102: handler run complete
  9529 1726773051.00120: attempt loop complete, returning result
  9529 1726773051.00125: _execute() done
  9529 1726773051.00129: dumping result to json
  9529 1726773051.00135: done dumping result, returning
  9529 1726773051.00144: done running TaskExecutor() for managed_node3/TASK: Set profile_mode to auto [0affffe7-6841-6cfb-81ae-000000000158]
  9529 1726773051.00151: sending task result for task 0affffe7-6841-6cfb-81ae-000000000158
  9529 1726773051.00187: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000158
  9529 1726773051.00191: WORKER PROCESS EXITING
changed: [managed_node3] => {
    "changed": true,
    "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5",
    "dest": "/etc/tuned/profile_mode",
    "gid": 0,
    "group": "root",
    "md5sum": "451e20aff0f489cd2f7d4d73533aa961",
    "mode": "0600",
    "owner": "root",
    "secontext": "system_u:object_r:tuned_etc_t:s0",
    "size": 5,
    "src": "/root/.ansible/tmp/ansible-tmp-1726773050.3826144-9529-216058296666777/source",
    "state": "file",
    "uid": 0
}
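
Here the copy wrote a 5-byte file to /etc/tuned/profile_mode, which matches the literal string "auto" plus a trailing newline, in line with the task name. The real task is at cleanup.yml:57 and is not shown in this log; under that assumption, a minimal sketch of an equivalent task is:

    # Sketch only -- the actual task in cleanup.yml may differ in detail.
    - name: Set profile_mode to auto
      ansible.builtin.copy:
        content: "auto\n"
        dest: /etc/tuned/profile_mode
        mode: "0600"
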
  8303 1726773051.00431: no more pending results, returning what we have
  8303 1726773051.00434: results queue empty
  8303 1726773051.00434: checking for any_errors_fatal
  8303 1726773051.00439: done checking for any_errors_fatal
  8303 1726773051.00440: checking for max_fail_percentage
  8303 1726773051.00441: done checking for max_fail_percentage
  8303 1726773051.00441: checking to see if all hosts have failed and the running result is not ok
  8303 1726773051.00442: done checking to see if all hosts have failed
  8303 1726773051.00442: getting the remaining hosts for this loop
  8303 1726773051.00443: done getting the remaining hosts for this loop
  8303 1726773051.00446: getting the next task for host managed_node3
  8303 1726773051.00450: done getting next task for host managed_node3
  8303 1726773051.00452:  ^ task is: TASK: Restart tuned
  8303 1726773051.00453:  ^ state is: HOST STATE: block=2, task=5, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
  8303 1726773051.00455: getting variables
  8303 1726773051.00456: in VariableManager get_vars()
  8303 1726773051.00482: Calling all_inventory to load vars for managed_node3
  8303 1726773051.00487: Calling groups_inventory to load vars for managed_node3
  8303 1726773051.00489: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773051.00497: Calling all_plugins_play to load vars for managed_node3
  8303 1726773051.00499: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773051.00500: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773051.00535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773051.00560: done with get_vars()
  8303 1726773051.00566: done getting variables
  8303 1726773051.00615: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Restart tuned] ***********************************************************
task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:64
Thursday 19 September 2024  15:10:51 -0400 (0:00:00.663)       0:00:27.584 **** 
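
This task goes through the service action plugin, which (as the chunks below show) first gathers the ansible_service_mgr fact and then dispatches to the systemd module. The task itself is at cleanup.yml:64 and, judging from the 'item' and '__kernel_settings_services' variables visible in this log, iterates over a service list loaded via include_vars; its contents are not shown here. A hedged sketch:

    # Sketch only -- the loop and service list are assumptions based on the
    # variables visible in this log, not the actual contents of cleanup.yml.
    - name: Restart tuned
      ansible.builtin.service:
        name: "{{ item }}"
        state: restarted
      loop: "{{ __kernel_settings_services }}"
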
  8303 1726773051.00636: entering _queue_task() for managed_node3/service
  8303 1726773051.00810: worker is 1 (out of 1 available)
  8303 1726773051.00824: exiting _queue_task() for managed_node3/service
  8303 1726773051.00836: done queuing things up, now waiting for results queue to drain
  8303 1726773051.00838: waiting for pending results...
  9567 1726773051.00951: running TaskExecutor() for managed_node3/TASK: Restart tuned
  9567 1726773051.01050: in run() - task 0affffe7-6841-6cfb-81ae-000000000159
  9567 1726773051.01067: variable 'ansible_search_path' from source: unknown
  9567 1726773051.01072: variable 'ansible_search_path' from source: unknown
  9567 1726773051.01107: variable '__kernel_settings_services' from source: include_vars
  9567 1726773051.01417: variable '__kernel_settings_services' from source: include_vars
  9567 1726773051.01475: variable 'omit' from source: magic vars
  9567 1726773051.01548: variable 'ansible_host' from source: host vars for 'managed_node3'
  9567 1726773051.01558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9567 1726773051.01569: variable 'omit' from source: magic vars
  9567 1726773051.01624: variable 'omit' from source: magic vars
  9567 1726773051.01651: variable 'omit' from source: magic vars
  9567 1726773051.01680: variable 'item' from source: unknown
  9567 1726773051.01735: variable 'item' from source: unknown
  9567 1726773051.01756: variable 'omit' from source: magic vars
  9567 1726773051.01793: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
  9567 1726773051.01819: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
  9567 1726773051.01837: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
  9567 1726773051.01849: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9567 1726773051.01858: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
  9567 1726773051.01880: variable 'inventory_hostname' from source: host vars for 'managed_node3'
  9567 1726773051.01884: variable 'ansible_host' from source: host vars for 'managed_node3'
  9567 1726773051.01900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9567 1726773051.01972: Set connection var ansible_pipelining to False
  9567 1726773051.01983: Set connection var ansible_timeout to 10
  9567 1726773051.01990: Set connection var ansible_module_compression to ZIP_DEFLATED
  9567 1726773051.01996: Set connection var ansible_shell_executable to /bin/sh
  9567 1726773051.01999: Set connection var ansible_connection to ssh
  9567 1726773051.02005: Set connection var ansible_shell_type to sh
  9567 1726773051.02019: variable 'ansible_shell_executable' from source: unknown
  9567 1726773051.02023: variable 'ansible_connection' from source: unknown
  9567 1726773051.02026: variable 'ansible_module_compression' from source: unknown
  9567 1726773051.02029: variable 'ansible_shell_type' from source: unknown
  9567 1726773051.02033: variable 'ansible_shell_executable' from source: unknown
  9567 1726773051.02036: variable 'ansible_host' from source: host vars for 'managed_node3'
  9567 1726773051.02041: variable 'ansible_pipelining' from source: unknown
  9567 1726773051.02045: variable 'ansible_timeout' from source: unknown
  9567 1726773051.02048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
  9567 1726773051.02135: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
  9567 1726773051.02144: variable 'omit' from source: magic vars
  9567 1726773051.02149: starting attempt loop
  9567 1726773051.02153: running the handler
  9567 1726773051.02223: variable 'ansible_facts' from source: unknown
  9567 1726773051.02254: _low_level_execute_command(): starting
  9567 1726773051.02263: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
  9567 1726773051.04619: stdout chunk (state=2):
>>>/root
<<<
  9567 1726773051.04742: stderr chunk (state=3):
>>><<<
  9567 1726773051.04751: stdout chunk (state=3):
>>><<<
  9567 1726773051.04774: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
  9567 1726773051.04791: _low_level_execute_command(): starting
  9567 1726773051.04797: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773051.047829-9567-115937499118733 `" && echo ansible-tmp-1726773051.047829-9567-115937499118733="` echo /root/.ansible/tmp/ansible-tmp-1726773051.047829-9567-115937499118733 `" ) && sleep 0'
  9567 1726773051.07307: stdout chunk (state=2):
>>>ansible-tmp-1726773051.047829-9567-115937499118733=/root/.ansible/tmp/ansible-tmp-1726773051.047829-9567-115937499118733
<<<
  9567 1726773051.07446: stderr chunk (state=3):
>>><<<
  9567 1726773051.07454: stdout chunk (state=3):
>>><<<
  9567 1726773051.07472: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773051.047829-9567-115937499118733=/root/.ansible/tmp/ansible-tmp-1726773051.047829-9567-115937499118733
, stderr=
  9567 1726773051.07500: variable 'ansible_module_compression' from source: unknown
  9567 1726773051.07546: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
  9567 1726773051.07601: variable 'ansible_facts' from source: unknown
  9567 1726773051.07758: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773051.047829-9567-115937499118733/AnsiballZ_setup.py
  9567 1726773051.07878: Sending initial data
  9567 1726773051.07888: Sent initial data (151 bytes)
  9567 1726773051.10503: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpuiq4g101 /root/.ansible/tmp/ansible-tmp-1726773051.047829-9567-115937499118733/AnsiballZ_setup.py
<<<
  9567 1726773051.12675: stderr chunk (state=3):
>>><<<
  9567 1726773051.12689: stdout chunk (state=3):
>>><<<
  9567 1726773051.12716: done transferring module to remote
  9567 1726773051.12730: _low_level_execute_command(): starting
  9567 1726773051.12736: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773051.047829-9567-115937499118733/ /root/.ansible/tmp/ansible-tmp-1726773051.047829-9567-115937499118733/AnsiballZ_setup.py && sleep 0'
  9567 1726773051.15280: stderr chunk (state=2):
>>><<<
  9567 1726773051.15292: stdout chunk (state=2):
>>><<<
  9567 1726773051.15310: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9567 1726773051.15315: _low_level_execute_command(): starting
  9567 1726773051.15320: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773051.047829-9567-115937499118733/AnsiballZ_setup.py && sleep 0'
  9567 1726773051.44128: stdout chunk (state=2):
>>>
{"ansible_facts": {"ansible_service_mgr": "systemd"}, "invocation": {"module_args": {"gather_subset": ["!all"], "filter": ["ansible_service_mgr"], "gather_timeout": 10, "fact_path": "/etc/ansible/facts.d"}}}
<<<
  9567 1726773051.45831: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  9567 1726773051.45883: stderr chunk (state=3):
>>><<<
  9567 1726773051.45894: stdout chunk (state=3):
>>><<<
  9567 1726773051.45910: _low_level_execute_command() done: rc=0, stdout=
{"ansible_facts": {"ansible_service_mgr": "systemd"}, "invocation": {"module_args": {"gather_subset": ["!all"], "filter": ["ansible_service_mgr"], "gather_timeout": 10, "fact_path": "/etc/ansible/facts.d"}}}
, stderr=Shared connection to 10.31.47.99 closed.
  9567 1726773051.45936: done with _execute_module (ansible.legacy.setup, {'gather_subset': '!all', 'filter': 'ansible_service_mgr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773051.047829-9567-115937499118733/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  9567 1726773051.45955: Facts {'ansible_facts': {'ansible_service_mgr': 'systemd'}, 'invocation': {'module_args': {'gather_subset': ['!all'], 'filter': ['ansible_service_mgr'], 'gather_timeout': 10, 'fact_path': '/etc/ansible/facts.d'}}, '_ansible_parsed': True}
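For readers following the trace: the setup invocation above gathers only the service-manager fact (gather_subset '!all', filter 'ansible_service_mgr') before the service action decides which backend module to run. A minimal sketch of an explicit task requesting the same facts is shown below; the play targeting and task name are illustrative assumptions, not taken from this run.

# Hypothetical sketch mirroring the module_args logged above
# (gather_subset, filter, gather_timeout); not part of the role under test.
- hosts: managed_node3
  gather_facts: false
  tasks:
    - name: Gather only the service-manager fact
      ansible.builtin.setup:
        gather_subset:
          - '!all'
        filter:
          - ansible_service_mgr
        gather_timeout: 10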
  9567 1726773051.46015: variable 'ansible_module_compression' from source: unknown
  9567 1726773051.46050: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-830370j1pdog/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED
  9567 1726773051.46103: variable 'ansible_facts' from source: unknown
  9567 1726773051.46258: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773051.047829-9567-115937499118733/AnsiballZ_systemd.py
  9567 1726773051.46364: Sending initial data
  9567 1726773051.46375: Sent initial data (153 bytes)
  9567 1726773051.49469: stdout chunk (state=3):
>>>sftp> put /root/.ansible/tmp/ansible-local-830370j1pdog/tmpp3510ozr /root/.ansible/tmp/ansible-tmp-1726773051.047829-9567-115937499118733/AnsiballZ_systemd.py
<<<
  9567 1726773051.51772: stderr chunk (state=3):
>>><<<
  9567 1726773051.51782: stdout chunk (state=3):
>>><<<
  9567 1726773051.51807: done transferring module to remote
  9567 1726773051.51817: _low_level_execute_command(): starting
  9567 1726773051.51822: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773051.047829-9567-115937499118733/ /root/.ansible/tmp/ansible-tmp-1726773051.047829-9567-115937499118733/AnsiballZ_systemd.py && sleep 0'
  9567 1726773051.54473: stderr chunk (state=2):
>>><<<
  9567 1726773051.54487: stdout chunk (state=2):
>>><<<
  9567 1726773051.54505: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9567 1726773051.54510: _low_level_execute_command(): starting
  9567 1726773051.54516: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773051.047829-9567-115937499118733/AnsiballZ_systemd.py && sleep 0'
  9567 1726773051.83176: stdout chunk (state=2):
>>>
{"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:38 EDT", "WatchdogTimestampMonotonic": "453344532", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "9802", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ExecMainStartTimestampMonotonic": "453204995", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "9802", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:38 EDT] ; stop_time=[n/a] ; pid=9802 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "17055744", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", 
"TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:38 EDT", "StateChangeTimestampMonotonic": "453344536", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:38 EDT", "InactiveExitTimestampMonotonic": "453205057", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ActiveEnterTimestampMonotonic": "453344536", "ActiveExitTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ActiveExitTimestampMonotonic": "453097312", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:38 EDT", "InactiveEnterTimestampMonotonic": "453201635", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:38 EDT", 
"ConditionTimestampMonotonic": "453202685", "AssertTimestamp": "Thu 2024-09-19 15:10:38 EDT", "AssertTimestampMonotonic": "453202686", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "29d42365ee9e42d1916b9ebf15b9284e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}}
<<<
  9567 1726773051.84837: stderr chunk (state=3):
>>>Shared connection to 10.31.47.99 closed.
<<<
  9567 1726773051.84888: stderr chunk (state=3):
>>><<<
  9567 1726773051.84897: stdout chunk (state=3):
>>><<<
  9567 1726773051.84916: _low_level_execute_command() done: rc=0, stdout=
{"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:38 EDT", "WatchdogTimestampMonotonic": "453344532", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "9802", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ExecMainStartTimestampMonotonic": "453204995", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "9802", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:38 EDT] ; stop_time=[n/a] ; pid=9802 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "17055744", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", 
"TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:38 EDT", "StateChangeTimestampMonotonic": "453344536", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:38 EDT", "InactiveExitTimestampMonotonic": "453205057", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ActiveEnterTimestampMonotonic": "453344536", "ActiveExitTimestamp": "Thu 2024-09-19 15:10:38 EDT", "ActiveExitTimestampMonotonic": "453097312", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:38 EDT", "InactiveEnterTimestampMonotonic": "453201635", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:38 EDT", 
"ConditionTimestampMonotonic": "453202685", "AssertTimestamp": "Thu 2024-09-19 15:10:38 EDT", "AssertTimestampMonotonic": "453202686", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "29d42365ee9e42d1916b9ebf15b9284e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}}
, stderr=Shared connection to 10.31.47.99 closed.
  9567 1726773051.85036: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773051.047829-9567-115937499118733/', '_ansible_remote_tmp': '~/.ansible/tmp'})
  9567 1726773051.85057: _low_level_execute_command(): starting
  9567 1726773051.85064: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773051.047829-9567-115937499118733/ > /dev/null 2>&1 && sleep 0'
  9567 1726773051.87681: stderr chunk (state=2):
>>><<<
  9567 1726773051.87693: stdout chunk (state=2):
>>><<<
  9567 1726773051.87711: _low_level_execute_command() done: rc=0, stdout=, stderr=
  9567 1726773051.87719: handler run complete
  9567 1726773051.87752: attempt loop complete, returning result
  9567 1726773051.87771: variable 'item' from source: unknown
  9567 1726773051.87838: variable 'item' from source: unknown
ok: [managed_node3] => (item=tuned) => {
    "ansible_loop_var": "item",
    "changed": false,
    "enabled": true,
    "item": "tuned",
    "name": "tuned",
    "state": "started",
    "status": {
        "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:38 EDT",
        "ActiveEnterTimestampMonotonic": "453344536",
        "ActiveExitTimestamp": "Thu 2024-09-19 15:10:38 EDT",
        "ActiveExitTimestampMonotonic": "453097312",
        "ActiveState": "active",
        "After": "systemd-journald.socket polkit.service systemd-sysctl.service dbus.socket basic.target system.slice network.target dbus.service sysinit.target",
        "AllowIsolate": "no",
        "AllowedCPUs": "",
        "AllowedMemoryNodes": "",
        "AmbientCapabilities": "",
        "AssertResult": "yes",
        "AssertTimestamp": "Thu 2024-09-19 15:10:38 EDT",
        "AssertTimestampMonotonic": "453202686",
        "Before": "shutdown.target multi-user.target",
        "BlockIOAccounting": "no",
        "BlockIOWeight": "[not set]",
        "BusName": "com.redhat.tuned",
        "CPUAccounting": "no",
        "CPUAffinity": "",
        "CPUAffinityFromNUMA": "no",
        "CPUQuotaPerSecUSec": "infinity",
        "CPUQuotaPeriodUSec": "infinity",
        "CPUSchedulingPolicy": "0",
        "CPUSchedulingPriority": "0",
        "CPUSchedulingResetOnFork": "no",
        "CPUShares": "[not set]",
        "CPUUsageNSec": "[not set]",
        "CPUWeight": "[not set]",
        "CacheDirectoryMode": "0755",
        "CanFreeze": "yes",
        "CanIsolate": "no",
        "CanReload": "no",
        "CanStart": "yes",
        "CanStop": "yes",
        "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf",
        "CollectMode": "inactive",
        "ConditionResult": "yes",
        "ConditionTimestamp": "Thu 2024-09-19 15:10:38 EDT",
        "ConditionTimestampMonotonic": "453202685",
        "ConfigurationDirectoryMode": "0755",
        "Conflicts": "power-profiles-daemon.service auto-cpufreq.service tlp.service cpupower.service shutdown.target",
        "ControlGroup": "/system.slice/tuned.service",
        "ControlPID": "0",
        "DefaultDependencies": "yes",
        "DefaultMemoryLow": "0",
        "DefaultMemoryMin": "0",
        "Delegate": "no",
        "Description": "Dynamic System Tuning Daemon",
        "DevicePolicy": "auto",
        "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)",
        "DynamicUser": "no",
        "EffectiveCPUs": "",
        "EffectiveMemoryNodes": "",
        "ExecMainCode": "0",
        "ExecMainExitTimestampMonotonic": "0",
        "ExecMainPID": "9802",
        "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:38 EDT",
        "ExecMainStartTimestampMonotonic": "453204995",
        "ExecMainStatus": "0",
        "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:38 EDT] ; stop_time=[n/a] ; pid=9802 ; code=(null) ; status=0/0 }",
        "FailureAction": "none",
        "FileDescriptorStoreMax": "0",
        "FragmentPath": "/usr/lib/systemd/system/tuned.service",
        "FreezerState": "running",
        "GID": "[not set]",
        "GuessMainPID": "yes",
        "IOAccounting": "no",
        "IOSchedulingClass": "0",
        "IOSchedulingPriority": "0",
        "IOWeight": "[not set]",
        "IPAccounting": "no",
        "IPEgressBytes": "18446744073709551615",
        "IPEgressPackets": "18446744073709551615",
        "IPIngressBytes": "18446744073709551615",
        "IPIngressPackets": "18446744073709551615",
        "Id": "tuned.service",
        "IgnoreOnIsolate": "no",
        "IgnoreSIGPIPE": "yes",
        "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:38 EDT",
        "InactiveEnterTimestampMonotonic": "453201635",
        "InactiveExitTimestamp": "Thu 2024-09-19 15:10:38 EDT",
        "InactiveExitTimestampMonotonic": "453205057",
        "InvocationID": "29d42365ee9e42d1916b9ebf15b9284e",
        "JobRunningTimeoutUSec": "infinity",
        "JobTimeoutAction": "none",
        "JobTimeoutUSec": "infinity",
        "KeyringMode": "private",
        "KillMode": "control-group",
        "KillSignal": "15",
        "LimitAS": "infinity",
        "LimitASSoft": "infinity",
        "LimitCORE": "infinity",
        "LimitCORESoft": "0",
        "LimitCPU": "infinity",
        "LimitCPUSoft": "infinity",
        "LimitDATA": "infinity",
        "LimitDATASoft": "infinity",
        "LimitFSIZE": "infinity",
        "LimitFSIZESoft": "infinity",
        "LimitLOCKS": "infinity",
        "LimitLOCKSSoft": "infinity",
        "LimitMEMLOCK": "65536",
        "LimitMEMLOCKSoft": "65536",
        "LimitMSGQUEUE": "819200",
        "LimitMSGQUEUESoft": "819200",
        "LimitNICE": "0",
        "LimitNICESoft": "0",
        "LimitNOFILE": "262144",
        "LimitNOFILESoft": "1024",
        "LimitNPROC": "14003",
        "LimitNPROCSoft": "14003",
        "LimitRSS": "infinity",
        "LimitRSSSoft": "infinity",
        "LimitRTPRIO": "0",
        "LimitRTPRIOSoft": "0",
        "LimitRTTIME": "infinity",
        "LimitRTTIMESoft": "infinity",
        "LimitSIGPENDING": "14003",
        "LimitSIGPENDINGSoft": "14003",
        "LimitSTACK": "infinity",
        "LimitSTACKSoft": "8388608",
        "LoadState": "loaded",
        "LockPersonality": "no",
        "LogLevelMax": "-1",
        "LogRateLimitBurst": "0",
        "LogRateLimitIntervalUSec": "0",
        "LogsDirectoryMode": "0755",
        "MainPID": "9802",
        "MemoryAccounting": "yes",
        "MemoryCurrent": "17055744",
        "MemoryDenyWriteExecute": "no",
        "MemoryHigh": "infinity",
        "MemoryLimit": "infinity",
        "MemoryLow": "0",
        "MemoryMax": "infinity",
        "MemoryMin": "0",
        "MemorySwapMax": "infinity",
        "MountAPIVFS": "no",
        "MountFlags": "",
        "NFileDescriptorStore": "0",
        "NRestarts": "0",
        "NUMAMask": "",
        "NUMAPolicy": "n/a",
        "Names": "tuned.service",
        "NeedDaemonReload": "no",
        "Nice": "0",
        "NoNewPrivileges": "no",
        "NonBlocking": "no",
        "NotifyAccess": "none",
        "OOMScoreAdjust": "0",
        "OnFailureJobMode": "replace",
        "PIDFile": "/run/tuned/tuned.pid",
        "PermissionsStartOnly": "no",
        "Perpetual": "no",
        "PrivateDevices": "no",
        "PrivateMounts": "no",
        "PrivateNetwork": "no",
        "PrivateTmp": "no",
        "PrivateUsers": "no",
        "ProtectControlGroups": "no",
        "ProtectHome": "no",
        "ProtectKernelModules": "no",
        "ProtectKernelTunables": "no",
        "ProtectSystem": "no",
        "RefuseManualStart": "no",
        "RefuseManualStop": "no",
        "RemainAfterExit": "no",
        "RemoveIPC": "no",
        "Requires": "system.slice sysinit.target dbus.service dbus.socket",
        "Restart": "no",
        "RestartUSec": "100ms",
        "RestrictNamespaces": "no",
        "RestrictRealtime": "no",
        "RestrictSUIDSGID": "no",
        "Result": "success",
        "RootDirectoryStartOnly": "no",
        "RuntimeDirectoryMode": "0755",
        "RuntimeDirectoryPreserve": "no",
        "RuntimeMaxUSec": "infinity",
        "SameProcessGroup": "no",
        "SecureBits": "0",
        "SendSIGHUP": "no",
        "SendSIGKILL": "yes",
        "Slice": "system.slice",
        "StandardError": "inherit",
        "StandardInput": "null",
        "StandardInputData": "",
        "StandardOutput": "journal",
        "StartLimitAction": "none",
        "StartLimitBurst": "5",
        "StartLimitIntervalUSec": "10s",
        "StartupBlockIOWeight": "[not set]",
        "StartupCPUShares": "[not set]",
        "StartupCPUWeight": "[not set]",
        "StartupIOWeight": "[not set]",
        "StateChangeTimestamp": "Thu 2024-09-19 15:10:38 EDT",
        "StateChangeTimestampMonotonic": "453344536",
        "StateDirectoryMode": "0755",
        "StatusErrno": "0",
        "StopWhenUnneeded": "no",
        "SubState": "running",
        "SuccessAction": "none",
        "SyslogFacility": "3",
        "SyslogLevel": "6",
        "SyslogLevelPrefix": "yes",
        "SyslogPriority": "30",
        "SystemCallErrorNumber": "0",
        "TTYReset": "no",
        "TTYVHangup": "no",
        "TTYVTDisallocate": "no",
        "TasksAccounting": "yes",
        "TasksCurrent": "4",
        "TasksMax": "22405",
        "TimeoutStartUSec": "1min 30s",
        "TimeoutStopUSec": "1min 30s",
        "TimerSlackNSec": "50000",
        "Transient": "no",
        "Type": "dbus",
        "UID": "[not set]",
        "UMask": "0022",
        "UnitFilePreset": "enabled",
        "UnitFileState": "enabled",
        "UtmpMode": "init",
        "WantedBy": "multi-user.target",
        "WatchdogTimestamp": "Thu 2024-09-19 15:10:38 EDT",
        "WatchdogTimestampMonotonic": "453344532",
        "WatchdogUSec": "0"
    }
}
  9567 1726773051.87937: dumping result to json
  9567 1726773051.87959: done dumping result, returning
  9567 1726773051.87969: done running TaskExecutor() for managed_node3/TASK: Restart tuned [0affffe7-6841-6cfb-81ae-000000000159]
  9567 1726773051.87975: sending task result for task 0affffe7-6841-6cfb-81ae-000000000159
  9567 1726773051.88089: done sending task result for task 0affffe7-6841-6cfb-81ae-000000000159
  9567 1726773051.88094: WORKER PROCESS EXITING
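The ok result above (service module resolved to systemd, loop item 'tuned', state 'started', enabled true) is consistent with a looped service task; the following is a hypothetical minimal equivalent shown for illustration only, not the actual task file referenced at cleanup.yml:64.

# Illustrative task shape that would produce a result like the one logged above;
# ansible.builtin.service hands off to the systemd module once
# ansible_service_mgr resolves to systemd, as seen in this trace.
- name: Restart tuned
  ansible.builtin.service:
    name: "{{ item }}"
    state: started
    enabled: true
  loop:
    - tuned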
  8303 1726773051.88456: no more pending results, returning what we have
  8303 1726773051.88458: results queue empty
  8303 1726773051.88458: checking for any_errors_fatal
  8303 1726773051.88461: done checking for any_errors_fatal
  8303 1726773051.88462: checking for max_fail_percentage
  8303 1726773051.88463: done checking for max_fail_percentage
  8303 1726773051.88463: checking to see if all hosts have failed and the running result is not ok
  8303 1726773051.88463: done checking to see if all hosts have failed
  8303 1726773051.88464: getting the remaining hosts for this loop
  8303 1726773051.88464: done getting the remaining hosts for this loop
  8303 1726773051.88466: getting the next task for host managed_node3
  8303 1726773051.88472: done getting next task for host managed_node3
  8303 1726773051.88473:  ^ task is: TASK: meta (flush_handlers)
  8303 1726773051.88474:  ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773051.88477: getting variables
  8303 1726773051.88478: in VariableManager get_vars()
  8303 1726773051.88500: Calling all_inventory to load vars for managed_node3
  8303 1726773051.88502: Calling groups_inventory to load vars for managed_node3
  8303 1726773051.88503: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773051.88510: Calling all_plugins_play to load vars for managed_node3
  8303 1726773051.88512: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773051.88514: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773051.88548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773051.88568: done with get_vars()
  8303 1726773051.88573: done getting variables
  8303 1726773051.88622: in VariableManager get_vars()
  8303 1726773051.88630: Calling all_inventory to load vars for managed_node3
  8303 1726773051.88631: Calling groups_inventory to load vars for managed_node3
  8303 1726773051.88633: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773051.88635: Calling all_plugins_play to load vars for managed_node3
  8303 1726773051.88637: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773051.88638: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773051.88658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773051.88675: done with get_vars()
  8303 1726773051.88682: done queuing things up, now waiting for results queue to drain
  8303 1726773051.88683: results queue empty
  8303 1726773051.88683: checking for any_errors_fatal
  8303 1726773051.88691: done checking for any_errors_fatal
  8303 1726773051.88692: checking for max_fail_percentage
  8303 1726773051.88692: done checking for max_fail_percentage
  8303 1726773051.88693: checking to see if all hosts have failed and the running result is not ok
  8303 1726773051.88693: done checking to see if all hosts have failed
  8303 1726773051.88694: getting the remaining hosts for this loop
  8303 1726773051.88695: done getting the remaining hosts for this loop
  8303 1726773051.88697: getting the next task for host managed_node3
  8303 1726773051.88700: done getting next task for host managed_node3
  8303 1726773051.88701:  ^ task is: TASK: meta (flush_handlers)
  8303 1726773051.88702:  ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773051.88705: getting variables
  8303 1726773051.88706: in VariableManager get_vars()
  8303 1726773051.88715: Calling all_inventory to load vars for managed_node3
  8303 1726773051.88716: Calling groups_inventory to load vars for managed_node3
  8303 1726773051.88718: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773051.88722: Calling all_plugins_play to load vars for managed_node3
  8303 1726773051.88724: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773051.88726: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773051.88755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773051.88778: done with get_vars()
  8303 1726773051.88784: done getting variables
  8303 1726773051.88823: in VariableManager get_vars()
  8303 1726773051.88833: Calling all_inventory to load vars for managed_node3
  8303 1726773051.88835: Calling groups_inventory to load vars for managed_node3
  8303 1726773051.88836: Calling all_plugins_inventory to load vars for managed_node3
  8303 1726773051.88840: Calling all_plugins_play to load vars for managed_node3
  8303 1726773051.88842: Calling groups_plugins_inventory to load vars for managed_node3
  8303 1726773051.88844: Calling groups_plugins_play to load vars for managed_node3
  8303 1726773051.88874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
  8303 1726773051.88901: done with get_vars()
  8303 1726773051.88910: done queuing things up, now waiting for results queue to drain
  8303 1726773051.88912: results queue empty
  8303 1726773051.88912: checking for any_errors_fatal
  8303 1726773051.88914: done checking for any_errors_fatal
  8303 1726773051.88915: checking for max_fail_percentage
  8303 1726773051.88916: done checking for max_fail_percentage
  8303 1726773051.88916: checking to see if all hosts have failed and the running result is not ok
  8303 1726773051.88917: done checking to see if all hosts have failed
  8303 1726773051.88917: getting the remaining hosts for this loop
  8303 1726773051.88918: done getting the remaining hosts for this loop
  8303 1726773051.88920: getting the next task for host managed_node3
  8303 1726773051.88922: done getting next task for host managed_node3
  8303 1726773051.88923:  ^ task is: None
  8303 1726773051.88924:  ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
  8303 1726773051.88925: done queuing things up, now waiting for results queue to drain
  8303 1726773051.88926: results queue empty
  8303 1726773051.88926: checking for any_errors_fatal
  8303 1726773051.88927: done checking for any_errors_fatal
  8303 1726773051.88927: checking for max_fail_percentage
  8303 1726773051.88928: done checking for max_fail_percentage
  8303 1726773051.88928: checking to see if all hosts have failed and the running result is not ok
  8303 1726773051.88929: done checking to see if all hosts have failed
  8303 1726773051.88931: getting the next task for host managed_node3
  8303 1726773051.88933: done getting next task for host managed_node3
  8303 1726773051.88933:  ^ task is: None
  8303 1726773051.88934:  ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node3              : ok=48   changed=8    unreachable=0    failed=0    skipped=19   rescued=0    ignored=0   

Thursday 19 September 2024  15:10:51 -0400 (0:00:00.883)       0:00:28.468 **** 
=============================================================================== 
fedora.linux_system_roles.kernel_settings : Ensure required packages are installed --- 5.59s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 
fedora.linux_system_roles.kernel_settings : Ensure required packages are installed --- 3.35s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 
fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started --- 1.26s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67 
fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes --- 1.22s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149 
fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role --- 0.94s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2 
fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started --- 0.93s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67 
Restart tuned ----------------------------------------------------------- 0.88s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:64 
fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory --- 0.79s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 
fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile --- 0.76s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91 
fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory --- 0.74s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 
fedora.linux_system_roles.kernel_settings : Apply kernel settings ------- 0.73s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 
Ensure kernel_settings is not in active_profile ------------------------- 0.71s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:46 
fedora.linux_system_roles.kernel_settings : Set profile_mode to manual --- 0.69s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99 
Set profile_mode to auto ------------------------------------------------ 0.66s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:57 
fedora.linux_system_roles.kernel_settings : Apply kernel settings ------- 0.64s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 
fedora.linux_system_roles.kernel_settings : Set profile_mode to manual --- 0.60s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99 
fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile --- 0.59s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91 
fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly --- 0.58s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2 
fedora.linux_system_roles.kernel_settings : Read tuned main config ------ 0.50s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42 
fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists --- 0.47s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74 
  8303 1726773051.89033: RUNNING CLEANUP